I want a program that will run on a machine with 4 GB or 8 GB of RAM and crawl a list of 200,000 URLs to find all external broken links, so that I can identify potentially high-authority expired domains to register. It needs a configurable search depth, at least 2 pages deep, to surface enough domains. Even if those 200k URLs turn up 20 million internal/external links, it must keep running without crashing or freezing. This is the problem with most programs such as Xenu and Screaming Frog, among others: none of them can handle scanning at this scale. I want BIG.
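For the memory problem above, the usual fix is a disk-backed crawl frontier: keep the URL queue and the visited set in SQLite instead of RAM, so 20 million links cost disk space, not memory. A minimal sketch in Python (the `Frontier` class and its layout are my own illustration, not taken from any existing tool):

```python
import sqlite3

class Frontier:
    """Disk-backed URL queue with built-in de-duplication.

    Keeping the frontier in SQLite means memory use stays flat even
    with tens of millions of discovered links.
    """

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS urls ("
            " id INTEGER PRIMARY KEY,"
            " url TEXT UNIQUE,"  # UNIQUE gives free de-duplication
            " depth INTEGER,"
            " done INTEGER DEFAULT 0)"
        )

    def push(self, url, depth):
        # INSERT OR IGNORE silently skips URLs we have already seen.
        self.db.execute(
            "INSERT OR IGNORE INTO urls (url, depth) VALUES (?, ?)",
            (url, depth),
        )
        self.db.commit()

    def pop(self):
        # Fetch the oldest unfinished URL (breadth-first order);
        # the depth value lets the caller stop at the configured limit.
        row = self.db.execute(
            "SELECT id, url, depth FROM urls WHERE done = 0 ORDER BY id LIMIT 1"
        ).fetchone()
        if row is None:
            return None
        self.db.execute("UPDATE urls SET done = 1 WHERE id = ?", (row[0],))
        self.db.commit()
        return row[1], row[2]
```

Because the queue lives in a file, a crash or restart resumes where it left off, which would also cover the automatic-saving requirement below.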
It must check each found domain for availability, probably using an API such as Namecheap's.
It must then check metrics for each available domain, both with and without the www. prefix.
It must gather metrics such as PA, DA, CF, TF, the CF/TF ratio, backlink counts, and MozRank via APIs (PA/DA/MozRank come from Moz, CF/TF from Majestic).
It must not freeze
It must be consistent
It must be fast
It may also need to save progress automatically.
Continued maintenance or support would be appreciated too.
Here are some examples of similar expired-domain finders:
[url removed, login to view]
Xenu - slow, and freezes every time it reaches 2 million found links.
Screaming Frog - slow, freezes, and runs out of memory.
Expired Domain Miner - about twice as fast as Xenu, but still not nearly good enough.
[url removed, login to view] - slow, expensive, freezes, and is dead at the moment.
[url removed, login to view] - dead at the moment.
Let me know if you know of existing software that can crawl huge numbers of URLs for broken links (and what it costs), or if you can build one yourself.
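On the broken-link side, the core decision is cheap to isolate: a link is a candidate when the request fails at the DNS/connection level (the whole domain may be expired) or returns a hard error status. A sketch of that decision, with the status-code cutoffs being my own choice rather than anything from the tools listed above:

```python
import urllib.request
import urllib.error

def classify(status=None, error=None):
    """Decide whether a fetched link looks broken.

    Returns "ok", "broken", or "dead-domain". The "dead-domain" case
    (DNS or connection failure) is the interesting one here, since it
    may mean the whole domain is expired and available to register.
    """
    if error is not None:
        return "dead-domain"  # could not even connect
    if status in (404, 410):
        return "broken"       # page gone, domain still alive
    if status is not None and status >= 500:
        return "broken"       # server error, worth re-checking later
    return "ok"

def check(url, timeout=10):
    """Issue a HEAD request and classify the result (does network I/O)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return classify(status=resp.status)
    except urllib.error.HTTPError as e:
        return classify(status=e.code)
    except Exception as e:  # DNS failure, refused connection, timeout, ...
        return classify(error=e)
```

Using HEAD rather than GET keeps bandwidth down across millions of checks; a production version would also need retries, politeness delays per host, and concurrency.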
10 freelancers are bidding on average $246 for this job
Hello, we have an excellent team of programmers and designers who can work on your project efficiently and complete the job on time. We have read your requirements carefully and will deliver good results. Thanks