So, let's start off with the simple website list crawl. The settings for this are covered by the general crawl settings, and these apply to all the other types of crawl as well, such as the search crawl and the endless crawl, so it's pretty simple really. First, the delay between each request to a website: one second here, and the value is in seconds. Secondly, concurrent websites to crawl, which is how many websites you want to crawl at any one point in time, and then how many threads will concurrently crawl per website. So with ten websites being crawled at once, and three different threads crawling each of those websites, that's 30 concurrent connections you've got going.
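The arithmetic behind those three settings can be sketched as follows. This is a minimal illustration, not the tool's actual configuration; all names here are made up for the example.

```python
# Hypothetical model of the general crawl settings described above.
from dataclasses import dataclass

@dataclass
class CrawlSettings:
    request_delay_seconds: float = 1.0   # delay between requests to one site
    concurrent_websites: int = 10        # sites crawled at the same time
    threads_per_website: int = 3         # crawler threads per site

    def total_connections(self) -> int:
        # Upper bound on simultaneous outbound connections.
        return self.concurrent_websites * self.threads_per_website

settings = CrawlSettings()
print(settings.total_connections())  # 10 sites x 3 threads = 30
```

The total connection count is what actually stresses your machine and your connection, which is why both knobs matter together rather than separately.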
Warehousing: I have no idea what percentage of domains are warehoused, but I know it happens. Tucows even admitted it on my blog: “I know you don’t like that we’re allowed to select expiring names for the Tucows Portfolio rather than letting them all go to auction or drop but that seems to be something we have to agree to disagree about.” ~ Ken Schafer (1st comment) http://www.dotweekly.com/could-you-explain-tucows
When a previously registered domain name expires, it goes through a grace period that gives the previous registrant one last chance to reclaim their domain before it becomes available to new buyers. If the previous registrant still has not renewed their domain by the end of the 77-day deletion process, the domain name will officially expire and enter the aftermarket.
Finally, just because you have an idea doesn’t mean that you have more rights to a domain name. That’s not the way the system works. For example, Microsoft owns more than 75,000 domain names. Surely they’re not using them all — but they were acquired through purchasing businesses, hand registering them for business ideas, defensively registering them, etc. And they have the right to do so, just like you do for any idea that they have yet to come up with…just like a real estate investor does for a plot of land they have no intention of building upon for the next decade…just like an art collector does who wraps a Picasso and puts it in their basement.
Backorders don’t come into play when a domain name expires at a registrar that has an auction partner, unless no bids are placed at that auction partner prior to the expiration date. A backorder (a domain being “caught”) only succeeds when a domain name goes through the full lifecycle: it expires, is removed from the registry’s database, and becomes available for new registration.
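That decision logic can be summarized in a few lines. This is a hedged sketch of the rule described above; the parameter names are illustrative assumptions, not any registry's or drop-catcher's actual API.

```python
# Sketch of when a backorder can actually catch a domain.
def backorder_can_catch(has_auction_partner: bool,
                        received_auction_bids: bool,
                        fully_dropped: bool) -> bool:
    """A backorder only succeeds if the domain completes the full
    lifecycle and drops back to the pool for new registration."""
    if has_auction_partner and received_auction_bids:
        # The auction winner takes the name; it never drops.
        return False
    return fully_dropped

# A name that went to its registrar's auction partner and got bids:
print(backorder_can_catch(True, True, False))   # False
# A name with no auction bids that completed the full drop:
print(backorder_can_catch(True, False, True))   # True
```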
Referring domains for anchor phrases usually reveal similar information. What you really want to look out for is whether the site has been picked up by a spammer in the past. Often these expired domains were dropped, picked up by a website owner who then tried to rank them for keywords like Viagra, Cialis and the like. You obviously don’t want that kind of domain.
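A simple anchor-text screen like the one above might be sketched as follows. The keyword list and the sample anchors are assumptions made up for the example, not data from any real backlink tool.

```python
# Illustrative check for spam history in a domain's anchor phrases.
SPAM_KEYWORDS = {"viagra", "cialis", "casino", "payday loan"}

def looks_spammed(anchor_phrases: list[str]) -> bool:
    """Flag a domain whose historical anchor text suggests a
    spammer owned it at some point."""
    for phrase in anchor_phrases:
        if any(keyword in phrase.lower() for keyword in SPAM_KEYWORDS):
            return True
    return False

print(looks_spammed(["best widgets", "widget reviews"]))  # False
print(looks_spammed(["cheap Viagra online", "widgets"]))  # True
```

In practice you would feed this the anchor phrases reported by whatever backlink index you use, and treat any hit as a reason to dig deeper before buying.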
Okay, so let's run it. I'm just going to copy those in there to give it a bit of a kick start. I'm on modern UK fibre, so that might actually still be too slow, but it's just trial and error with the number of crawling threads, because it depends what kind of resources your PC has and what kind of connection you're on. Okay, so we'll come back in a minute. Oh, by the way, you can also get the DA and PA for each result for any of the crawl types, as long as you put your Moz API key in the settings. I just paused the video because I wanted to play around with the settings; I haven't used this on my home fibre, and while these settings worked pretty well on my UK fibre, like I said, it will vary from connection to connection and from machine to machine as well. So, because I want some results to show you pretty quickly, I swapped it to a website list and we're going to crawl some directories, because they're always really good for finding tons of expired domains.
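The threaded website-list crawl being tuned here can be sketched with Python's standard thread pool. This is a minimal stand-in, assuming a plain list of seed URLs; it is not the tool's own implementation, and the URLs are placeholders.

```python
# Minimal multi-threaded website list crawl sketch.
from concurrent.futures import ThreadPoolExecutor
import time

def crawl_site(url: str, delay_seconds: float = 1.0) -> str:
    # Stand-in for fetching a page; a real crawler would issue an
    # HTTP request here and extract outbound links to check for
    # expired domains.
    time.sleep(delay_seconds)
    return f"crawled {url}"

seed_urls = [
    "http://example.com/dir1",
    "http://example.com/dir2",
    "http://example.com/dir3",
]

# Tune max_workers by trial and error: as noted above, the right
# number depends on your machine's resources and your connection.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(crawl_site, seed_urls))

print(results)
```

`pool.map` preserves the input order of the seed list, so the results line up with the URLs you fed in, which makes it easy to attach per-site metrics (like DA/PA lookups) afterwards.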
Yes, I know, it seems backwards that you’d have to compete in an auction for a domain name that may never be caught. Other people feel the same way and many don’t compete as a result of that. But that’s the way the drop catching service was set up. Who knows, it may mean less competition for you since others do not like that model, which benefits the company rather than the user.
The way this list is implemented means I can't update prices or automatically remove sold domains, and Sedo doesn't provide me with the tools I need to implement it properly. Unfortunately, some users keep contacting me to update their domain prices or remove their domains, and I can't keep doing that manually. So I've decided to disable the list for now.