Hello
We have strong experience in developing web crawlers/spiders and will be able to help you with this project. Let me explain a few things:
1) The whole system will consist of a server side (hosting the database, the web service for data exchange, and the frontend for users) and client modules that will query the major search engines (Bing, Google, Yahoo) for the specified words. Spreading the parsing across several client modules significantly reduces the chance of being banned by the search engines
2) The client module (a Windows application) will drive Internet Explorer and simulate normal user activity (e.g. the kind of delays between requests that are typical for a human). This further reduces the chance of being banned
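To illustrate the human-like pacing idea, here is a minimal sketch in Python (the real client will be a Windows/.NET application; the function names and timing values below are illustrative assumptions, not the final parameters):

```python
import random
import time

def human_delay(min_s=2.0, max_s=8.0, pause_chance=0.1):
    """Return a randomized, human-like delay (in seconds) between requests.

    Occasionally adds a longer 'reading' pause, as a real user would
    when studying a results page. All values here are assumptions."""
    delay = random.uniform(min_s, max_s)
    if random.random() < pause_chance:
        delay += random.uniform(5.0, 15.0)
    return delay

def wait_like_human():
    # Called between consecutive search requests in the crawl loop.
    time.sleep(human_delay())
```

The point is that requests never arrive at a fixed interval, which is one of the patterns search engines use to detect bots.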
3) The client module will be able to show the user a captcha if the search engine asks for one
4) The list of keywords/search phrases ("words/numbers" in your description) will be stored on the server, and each client module will check out one item from that list at a time for parsing. We will implement a smart locking mechanism for this (fortunately, we have already built one before)
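The check-out logic can be sketched like this (a simplified in-process Python model of the lease idea; the real implementation will live in the ASP.NET web service and database, and all names here are assumptions):

```python
import threading
import time

class KeywordPool:
    """Hands out one keyword per client with a timed lease, so two
    clients never parse the same phrase at once. If a client dies,
    its lease expires and the keyword becomes available again."""

    def __init__(self, keywords, lease_seconds=300):
        self._lock = threading.Lock()
        self._lease_seconds = lease_seconds
        # keyword -> lease expiry timestamp (0.0 means free)
        self._leases = {kw: 0.0 for kw in keywords}

    def acquire(self):
        now = time.time()
        with self._lock:
            for kw, expires in self._leases.items():
                if expires <= now:  # free, or the previous lease expired
                    self._leases[kw] = now + self._lease_seconds
                    return kw
        return None  # everything is currently leased out

    def release(self, keyword):
        with self._lock:
            self._leases[keyword] = 0.0

pool = KeywordPool(["acme widgets", "acme gadgets"])
kw1 = pool.acquire()
kw2 = pool.acquire()
# kw1 and kw2 are distinct phrases; a third acquire() returns None
pool.release(kw1)
```

The timed lease matters because client modules run on separate machines and may crash mid-parse; without expiry, a dead client would hold its keyword forever.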
5) The data exchange protocol will be SOAP over HTTP (ASP.NET web services)
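For a feel of what goes over the wire, here is a sketch of a SOAP 1.1 request envelope built with Python's standard library (the `GetKeyword` operation, the `clientId` parameter, and the service namespace are hypothetical placeholders; the real contract will be defined by the ASP.NET web service's WSDL):

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
# Hypothetical service namespace; the real one comes from the WSDL.
SVC_NS = "http://example.com/crawler"

def build_get_keyword_request(client_id):
    """Build a SOAP 1.1 envelope for a hypothetical GetKeyword call."""
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, f"{{{SVC_NS}}}GetKeyword")
    ET.SubElement(call, f"{{{SVC_NS}}}clientId").text = client_id
    return ET.tostring(envelope, encoding="unicode")
```

In practice the .NET client proxy is generated from the WSDL, so neither side builds this XML by hand; the sketch just shows the envelope/body structure SOAP mandates.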
6) The frontend will be implemented with ASP.NET (classic Web Forms, not MVC, as MVC is not necessary here at all). There will be two roles: user and admin. The admin will be able to create user logins. Active Directory integration is possible as well; let me know if you will need it
7) I think you will need the search results for some kind of analytics, so I would suggest storing the URL of every result (to access it later if you need to) and caching the HTML of that URL. This will give you full control over the information
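The caching step can be sketched like this (a minimal Python illustration; the cache directory name and function names are assumptions, and in the real system this would be server-side .NET code writing to the database or disk):

```python
import hashlib
import pathlib
import urllib.request

CACHE_DIR = pathlib.Path("result_cache")  # assumed location

def cache_result(url, html=None):
    """Store the HTML of a search-result URL under a stable hashed
    filename, so it can be re-analyzed later without re-fetching."""
    CACHE_DIR.mkdir(exist_ok=True)
    name = hashlib.sha256(url.encode("utf-8")).hexdigest() + ".html"
    path = CACHE_DIR / name
    if html is None:  # fetch only if the caller has not already got the page
        with urllib.request.urlopen(url) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    path.write_text(html, encoding="utf-8")
    return path
```

Hashing the URL gives a filesystem-safe, collision-resistant key, so the same result URL always maps to the same cached file.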
I have a few more questions (for example, about the HTTPS requests you mentioned), so it would be great to discuss them in PMB.
Denis