The project is a web crawler I wrote in Node.js. It crawls hundreds of pages through proxy hosts.
The application has two bugs: there is a connection error that I cannot seem to catch correctly, and after about 15–20 minutes and roughly 165 responses (about the number of pages on the website being crawled) the crawler stops. These might have the same cause.
I don't have the time to fix this myself, so I am looking for a senior Node.js developer to do it.
If I am satisfied with the result, there will probably be a lot more work to come.
15 freelancers are bidding an average of €173 for this job
Hello, what is your tech stack? Are you using PhantomJS or Puppeteer? I'd be interested in taking a close look at your source. Do contact me when available. Mateen
Hi. I can do your job perfectly because I am an experienced senior Node developer with solid web-crawler experience. I look forward to working with you.
Pay me only if I can fix it. I have built several highly scalable scrapers and have 3+ years of Node.js experience. Get in touch: Telegram: diensh3836, Skype: deinsh3836 @ [login to view URL]
I can help with this problem. I have plenty of experience working with asynchronous environments and functions, as well as with web services and Node.js; some time ago I fixed a memory leak in an Uber-like API.