Now, the number of errors before we give up crawling a website: in case a site has some kind of anti-scraping technology, or the website is just completely knackered, thirty is generally fine. Limit the number of pages crawled per website: most of the time you want that ticked. I have it at about a hundred thousand, unless you're going to be crawling some particularly big websites. I like having that limit because, if a website has some kind of weird URL structure, like an old-school date picker, you don't want to be endlessly stuck crawling it. Show URLs being crawled: you can turn this on for debugging if you just want to see what it's doing, but I generally leave it off because that makes it slightly faster. Write results into a text file: as the crawler goes along and finds expired domains, it writes them into a text file as well as showing them in the GUI, just in case there's a crash or your PC shuts down or something like that.
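The settings above boil down to a per-site crawl loop with an error cap, a page cap, and incremental writing of results. Here's a minimal sketch of that logic; the function names, the `fetch`/`extract_links` callables, and the results file name are all hypothetical, not the tool's actual internals:

```python
from collections import deque
from urllib.parse import urlparse

MAX_ERRORS = 30        # give up on a site after this many failed fetches
MAX_PAGES = 100_000    # cap on pages crawled per website

def crawl_site(start_url, fetch, extract_links, results_path="found_domains.txt"):
    """Crawl one site, respecting the error and page limits, and append
    any expired domains found to a text file as the crawl goes along."""
    queue = deque([start_url])
    seen = {start_url}
    errors = pages = 0
    host = urlparse(start_url).netloc
    while queue and errors < MAX_ERRORS and pages < MAX_PAGES:
        url = queue.popleft()
        try:
            html = fetch(url)
        except Exception:
            errors += 1  # anti-scraping blocks, broken pages, timeouts, etc.
            continue
        pages += 1
        # extract_links is assumed to yield (url, is_expired_domain) pairs
        for link, expired in extract_links(html):
            if expired:
                # write incrementally so a crash doesn't lose results
                with open(results_path, "a") as f:
                    f.write(link + "\n")
            elif urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)
    return pages, errors
```

Keeping the crawl within one host and bounding both errors and pages is what prevents the "endless crawl" problem on sites with infinite URL spaces like date pickers.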
From the most basic perspective, any website or platform on the internet that lets people post a link to an outside site is a potential source of traffic. However, traffic sources are not created equal, and some are very problematic. For example, if you just share content and links on social media sites, chances are somebody, at some point, will object to that practice and call it spamming. It's very easy to get your affiliate account shut down because people complain. Similarly, you could blog for what seems like forever and still fail to get much love from search engines.
Contains, Starts, Ends You can use the search bar to find domains that contain a particular keyword you are seeking; e.g. a search for "car" will yield all expired domain names that contain the term "car". Using the drop-down box next to the search form, you can also find domains that begin or end with the search term.
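The three match modes can be sketched with plain string operations. This is an illustrative reimplementation, not the site's actual code; the `mode` names mirror the drop-down options, and I've assumed the "ends" mode matches against the name without the TLD:

```python
def search_domains(domains, term, mode="contains"):
    """Filter a list of domain names by a keyword, mimicking the
    search bar's 'contains' / 'starts' / 'ends' drop-down modes."""
    term = term.lower()
    checks = {
        "contains": lambda d: term in d,
        "starts": lambda d: d.startswith(term),
        # strip the TLD so "racecar.org" counts as ending with "car"
        "ends": lambda d: d.rsplit(".", 1)[0].endswith(term),
    }
    return [d for d in domains if checks[mode](d.lower())]
```

So with `["carparts.com", "usedcars.net", "racecar.org"]`, a "starts" search for "car" keeps only `carparts.com`, while an "ends" search keeps only `racecar.org`.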
What I like to do is sort by DP, which stands for domain pop: basically the number of linking root domains. BL, by contrast, is the raw number of backlinks, and as you know that can be somewhat misleading if a domain has a lot of sitewide links or multiple links from the same domain. Sorting by domain pop brings up the sites with the greatest number of referring domains.
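The difference between BL and DP is just raw link count versus unique referring hosts. A quick sketch of that distinction, assuming you have each candidate's backlinks as a list of URLs (a hypothetical data shape, not the tool's export format):

```python
from urllib.parse import urlparse

def domain_pop(backlinks):
    """DP: number of unique referring root domains. Sitewide or repeated
    links from one domain count once, unlike the raw backlink count (BL)."""
    return len({urlparse(url).netloc for url in backlinks})

def sort_by_dp(candidates):
    """candidates: {expired_domain: [backlink URLs]} -> domains sorted
    by domain pop, highest first (the sort described above)."""
    return sorted(candidates, key=lambda d: domain_pop(candidates[d]), reverse=True)
```

A domain with three sitewide links from one site has BL = 3 but DP = 1, which is exactly why DP is the less misleading sort key.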
Here's a test I have in progress. I bought an old $5 closeout domain from GoDaddy's TDNAM expired-domain auction. I put a quick minisite up and linked to it with a crazy anchor-text phrase. The domain is now ranking for that crazy term. It's gone through one PageRank update and I'm waiting for a second to come. Then I'll redirect the domain to another minisite. I suspect the second site won't rank for the anchor text, but we'll see.
The whitelist filter for wanted characters is now split into three filters. The old whitelist is now named whitelist (only). When used, the domain has to consist only of the defined characters or numbers. No other characters are allowed. The new whitelist (any) filter finds domains that have at least one of the defined characters or numbers. The new whitelist (all) filter finds domains that have all defined characters or numbers at least once.
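The three whitelist variants map neatly onto set operations. A minimal sketch, assuming the filters are applied to the domain name itself (these function names are mine, not the tool's):

```python
def whitelist_only(name, chars):
    """whitelist (only): the name may consist solely of the defined
    characters; any other character disqualifies it."""
    return set(name) <= set(chars)

def whitelist_any(name, chars):
    """whitelist (any): the name must contain at least one of the
    defined characters."""
    return bool(set(name) & set(chars))

def whitelist_all(name, chars):
    """whitelist (all): every defined character must appear in the
    name at least once."""
    return set(chars) <= set(name)
```

For example, with the whitelist "ab": "abba" passes all three filters, "abc" fails (only) but passes (any) and (all), and "aaa" passes (only) and (any) but fails (all) because "b" never appears.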