Answer from IAmTimCorey on Stack Overflow

This is for a custom search, not a normal Google search. For example, if you owned abc.com and acme.com, you could set up a custom search on those two domains for your customers. That way, they could search your sites for information. The 5,000-site limit is actually huge; I can't think of an application that would use that many specified sites.
I think what you are looking for is the Google Web Search API, which searched Google's entire index. Unfortunately, that API is now deprecated (reference: http://code.google.com/apis/websearch/). You can still use it, but that is a risk because Google reserves the right to turn it off at any time. They also limit the number of searches you can perform per day (although I can't find a specific number for that limit). Here is a link to their terms: http://code.google.com/apis/websearch/terms.html
I would recommend looking at an API from another search engine if you really want to integrate search directly into your code. Another option is to put your search functionality behind an interface and implement it against Google for now. Then, if Google turns the API off or something better comes out, you only have to change the search code behind that interface to point at the new API.
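A minimal sketch of that interface idea in Python (names like `SearchProvider` are illustrative, not from any real library):

```python
from abc import ABC, abstractmethod

class SearchProvider(ABC):
    """Abstraction boundary: the rest of the app only knows this interface."""

    @abstractmethod
    def search(self, query: str) -> list[str]:
        """Return a list of result URLs for the query."""

class GoogleSearchProvider(SearchProvider):
    def search(self, query: str) -> list[str]:
        # Call whichever Google API you are using today.
        raise NotImplementedError("wire up the current Google API here")

class OtherEngineProvider(SearchProvider):
    def search(self, query: str) -> list[str]:
        # If Google turns its API off, only this class changes.
        raise NotImplementedError("wire up the replacement engine here")

def find_results(provider: SearchProvider, query: str) -> list[str]:
    # Application code depends on the interface, not on Google.
    return provider.search(query)
```

The payoff is that swapping search engines later is a one-class change instead of a hunt through every call site.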
Google Custom Search is actually capable of searching the entire web, although the setting is not obvious. See "Search the entire web".
The other problems you are likely to run into are:
- You only get 100 results per search, and
- You are limited to 100 queries per day.
Sadly, "upgrading" to Google Site Search eliminates problem #2, but at the cost of losing the ability to search the entire web.
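A hedged sketch of paging through the Custom Search JSON API with the standard requests library; `API_KEY` and `CX` are placeholders you get from the Google developer console, and the loop shows why the 100-result cap (limit #1 above) is hard: `start` can go no higher than 91 with `num=10`.

```python
import requests

API_KEY = "YOUR_API_KEY"   # placeholder
CX = "YOUR_ENGINE_ID"      # placeholder; enable "Search the entire web" in its settings

def custom_search(query: str) -> list[dict]:
    results = []
    for start in range(1, 100, 10):  # pages at 1, 11, ..., 91 -> at most 100 items
        resp = requests.get(
            "https://www.googleapis.com/customsearch/v1",
            params={"key": API_KEY, "cx": CX, "q": query,
                    "start": start, "num": 10},
            timeout=10,
        )
        resp.raise_for_status()
        items = resp.json().get("items", [])
        if not items:
            break  # fewer than 100 results exist for this query
        results.extend(items)
    return results
```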
Hi. I am currently going through different SaaS products and I found a business that is selling access to an API for doing Google searches. I don't understand who is paying for this.
First of all, I thought that Google sells access to its own search through its own API, but apparently that was shut down for some reason?
Out of curiosity, I checked how hard it would be with Python and Scrapy, and it is around 10 lines of code to get a search response from Google.
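Roughly what those "10 lines" look like with requests plus a parser. This is a sketch only: Google aggressively blocks automated clients, the HTML structure changes without notice, and scraping may violate its terms.

```python
import requests
from bs4 import BeautifulSoup

def scrape_google(query: str) -> list[str]:
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": query},
        headers={"User-Agent": "Mozilla/5.0"},  # without a browser UA you get blocked immediately
        timeout=10,
    )
    soup = BeautifulSoup(resp.text, "html.parser")
    # "h3" matches result titles at the time of writing; expect it to break.
    return [h3.get_text() for h3 in soup.select("h3")]
```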
I am wondering what I am missing. I was thinking about two obvious(?) problems:
1. How well Scrapy/Requests behind, say, FastAPI would scale, and how slow it would be.
2. How much Google would tolerate. I can only guess that if one IP makes a new search every 3-5 seconds non-stop, Google would ban that IP. So the more requests you make, the bigger the IP pool you need? (See the sketch after this list.)
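A sketch of the scaling math behind point 2, assuming placeholder proxy addresses: at one request every ~4 seconds, a single IP yields only about 21k requests per day, so 200k searches already implies a rotating proxy fleet plus retry and CAPTCHA handling.

```python
import itertools
import time
import requests

PROXIES = ["http://proxy1:8080", "http://proxy2:8080"]  # placeholders
proxy_cycle = itertools.cycle(PROXIES)

def throttled_search(query: str, delay: float = 4.0) -> str:
    proxy = next(proxy_cycle)  # rotate IPs to spread request volume
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": query},
        headers={"User-Agent": "Mozilla/5.0"},
        proxies={"http": proxy, "https": proxy},
        timeout=10,
    )
    time.sleep(delay)  # ~one request every 4 s per worker, ~21k/day per IP
    return resp.text
```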
But other than that, what am I missing? Why is someone paying 200 dollars for 200k Google searches?