Build MVP - Email Scraper (search a term then emails are returned)
$250-750 USD
Payment on delivery
Hello!
I'm looking to build a simple 4-page MVP for a new startup. It will work as follows:
1.) User enters search terms & can choose to scrape emails from Twitter, Facebook, Search Engines, Instagram or All from a dropdown.
2.) After the user has entered their search term and selected the source (Twitter, Facebook, Search Engines, Instagram or All), the web app will scrape that source for emails related to the search term (I need your suggestions for the best ways to do this).
3.) The results are displayed in a table showing the person's name (if available) and their email address. The user can then remove emails and download the results as a CSV. 80 results are displayed per page.
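On the "best ways to do this" question: note that Twitter, Facebook and Instagram forbid raw HTML scraping in their terms of service, so those sources realistically need their official APIs. For search-engine results and ordinary web pages, though, the core of step 2 is just pulling addresses out of fetched HTML. A minimal sketch (stdlib only; the regex is an assumption good enough for an MVP, and will miss obfuscated addresses like "name [at] example [dot] com"):

```python
import re
import urllib.request

# Simple email pattern; deliberately loose for an MVP.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def extract_emails(html: str) -> list[str]:
    """Return unique email addresses found in a page, preserving order."""
    seen: list[str] = []
    for match in EMAIL_RE.findall(html):
        if match.lower() not in (e.lower() for e in seen):
            seen.append(match)
    return seen

def scrape_url(url: str) -> list[str]:
    """Fetch a page and pull emails out of it (no JavaScript rendering)."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return extract_emails(html)
```

In production this would sit behind a job queue with rate limiting and robots.txt checks, which is also where the "easy to scale" requirement would be addressed.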
That's it. Keeping it super simple. I will provide the front-end HTML/CSS/JavaScript layout files; you just need to build the backend (with some form of admin panel for viewing users).
This needs to be SECURE and easy to scale. The code should be clean, and selected freelancers must have prior experience building web apps.
Pages:
1.) Home page
2.) Step 1 (search box + dropdown)
3.) Step 2 (results displayed, download CSV & remove entries)
4.) Signup page (take email + password)
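The Step 2 page (results, CSV download, remove entries) mostly reduces to two small backend pieces: slicing the result set into pages of 80 and serializing whatever remains after the user prunes it. A stdlib sketch, assuming results are kept as name/email dicts (the field names are my assumption, not from the spec):

```python
import csv
import io

PAGE_SIZE = 80  # spec: 80 results displayed per page

def paginate(results: list[dict], page: int) -> list[dict]:
    """Return the slice of results for a 1-indexed page number."""
    start = (page - 1) * PAGE_SIZE
    return results[start:start + PAGE_SIZE]

def to_csv(results: list[dict]) -> str:
    """Serialize the (possibly user-pruned) result set for download."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["name", "email"])
    writer.writeheader()
    writer.writerows(results)
    return buf.getvalue()
```

The "remove entries" action then just deletes rows server-side (or client-side) before `to_csv` is called, so the download always matches the table the user sees.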
No login is required, but I would like to restrict access to the tool with a password overlay that I can give to MVP testers to test the general functionality.
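Since the overlay is a single shared password rather than per-user auth, the backend only needs a constant-time check against a stored hash (never the plaintext in code). A minimal sketch; the salt and example password here are hypothetical, and in practice the hash would come from an environment variable:

```python
import hashlib
import hmac

# Hypothetical values for illustration only.
_SALT = b"mvp-overlay"
_TESTER_HASH = hashlib.sha256(_SALT + b"letmein").hexdigest()

def overlay_ok(submitted: str) -> bool:
    """Constant-time check of the shared overlay password."""
    digest = hashlib.sha256(_SALT + submitted.encode()).hexdigest()
    return hmac.compare_digest(digest, _TESTER_HASH)
```

On success the app would set a session cookie so testers aren't re-prompted on every page.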
Please bid accordingly, looking to start this weekend.
Thanks!
Project ID: #6682241
About the project
6 freelancers are bidding an average of $858 for this job
Dear Sir, I have read your project description and understand the overall scope of work, but in terms of layout/design, can you show me a few reference/example website URLs? This will give me a much better idea about yo…
Hello, I have been working with RoR for over 3 years. I believe that makes me the right candidate for the job.
I guarantee all of my work; if you aren't satisfied, you don't have to pay me. It'll probably take two weeks, just because you have to use 4 different APIs to scrape the data from