Google Search Results in Ruby


This Ruby gem scrapes and parses results from Google, Bing, Baidu, Yandex, Yahoo, eBay and more using SerpApi.

The following services are provided: Search API, Search Archive API, Account API and Location API. SerpApi also provides a script builder to get you started quickly.


A modern Ruby must already be installed:

$ gem install google_search_results


Tested Ruby versions:

  • 2.5
  • 3.0
  • 3.1
  • 3.2

See: GitHub Actions.

Quick start

require 'google_search_results'
search = GoogleSearch.new(q: "coffee", serp_api_key: "secret_api_key")
hash_results = search.get_hash

This example runs a search for "coffee" using your secret API key.

The service (backend):

  • searches on Google using the query q = "coffee"
  • parses the messy HTML responses
  • returns a standardized JSON response

The class GoogleSearch:

  • formats the request to the server
  • executes the GET HTTP request
  • parses the JSON into a Ruby Hash using the JSON standard library provided by Ruby

Et voila..
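Under the hood, those steps amount to little more than building a query string, issuing a GET, and parsing JSON. A minimal stdlib-only sketch (the endpoint URL and the stand-in payload are illustrative, not the gem's actual internals):

```ruby
require 'uri'
require 'json'

# step 1: format the request - build the query string from a params hash
params = { q: "coffee", api_key: "secret_api_key" }
uri = URI("https://serpapi.com/search")
uri.query = URI.encode_www_form(params)

# step 2: execute the GET request (not run here - Net::HTTP.get(uri) would do it)

# step 3: parse the JSON body into a Ruby Hash with symbol keys
body = '{"search_metadata": {"status": "Success"}}' # stand-in for the response
hash = JSON.parse(body, symbolize_names: true)
```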

Alternatively, you can search:

  • Bing using BingSearch class
  • Baidu using BaiduSearch class
  • Yahoo using YahooSearch class
  • Yandex using YandexSearch class
  • Ebay using EbaySearch class
  • Home Depot using HomedepotSearch class
  • Youtube using YoutubeSearch class

See the playground to generate your code.



How to set the private API key

The api_key can be set globally using a singleton pattern.

GoogleSearch.api_key = "secret_api_key"
search = GoogleSearch.new(q: "coffee")

or the api_key can be provided for each search:

search = GoogleSearch.new(q: "coffee", api_key: "secret_api_key")

To get the key, simply copy/paste it from your SerpApi account dashboard.
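The "singleton pattern" mentioned above is simply a class-level attribute that instances fall back on when no per-search key is given. A self-contained sketch of the idea (ExampleSearch is a stand-in, not the gem's real class):

```ruby
# class-level attribute shared by every search, overridable per instance
class ExampleSearch
  class << self
    attr_accessor :api_key
  end

  def initialize(params = {})
    @params = params
  end

  # prefer the per-search key, fall back to the global one
  def api_key
    @params[:api_key] || self.class.api_key
  end
end

ExampleSearch.api_key = "global_key"
default_search  = ExampleSearch.new
override_search = ExampleSearch.new(api_key: "per_search_key")
```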

Search API capability for Google

search_params = {
  q: "search",
  google_domain: "Google Domain",
  location: "Location Requested",
  device: "desktop|mobile|tablet",
  hl: "Google UI Language",
  gl: "Google Country",
  safe: "Safe Search Flag",
  num: "Number of Results",
  start: "Pagination Offset",
  api_key: "private key", # your secret API key
  tbm: "nws|isch|shop",
  tbs: "custom to be search criteria",
  async: true|false # allow async
}

# define the search
search = GoogleSearch.new(search_params)

# override an existing parameter
search.params[:location] = "Portland,Oregon,United States"

# search format return as raw html
html_results = search.get_html

# search format returns a Hash
hash_results = search.get_hash

# search as raw JSON format
json_results = search.get_json

See the full documentation for the complete list of supported parameters.

More search APIs are documented on the SerpApi website.

You will find more hands-on examples below.

Example by specification

We love true open source, continuous integration and Test Driven Development (TDD). We use RSpec to test our infrastructure around the clock to achieve the best QoS (Quality of Service).

The directory test/ includes specification/examples.

Set your API key:

export API_KEY="your secret key"

Install RSpec

gem install rspec

To run the tests:

rspec test

or if you prefer Rake:

rake test

Location API

location_list = GoogleSearch.new(q: "Austin", limit: 3).get_location
pp location_list

It prints the first 3 locations matching "Austin" (Texas, Texas, Rochester):

[{
   id: "585069bdee19ad271e9bc072",
   google_id: 200635,
   google_parent_id: 21176,
   name: "Austin, TX",
   canonical_name: "Austin,TX,Texas,United States",
   country_code: "US",
   target_type: "DMA Region",
   reach: 5560000,
   gps: [-97.7430608, 30.267153],
   keys: ["austin", "tx", "texas", "united", "states"]
 },
 ...]

Search Archive API

This API lets you retrieve a previous search. To do so, first run a search and save its search_id.

search = GoogleSearch.new(q: "Coffee", location: "Portland")
original_search = search.get_hash
search_id = original_search[:search_metadata][:id]

Now let's retrieve the previous search from the archive.

search = GoogleSearch.new
archive_search = search.get_search_archive(search_id)
pp archive_search

It prints the search results from the archive.

Account API

search = GoogleSearch.new
pp search.get_account

It prints your account information.

Search Google Images

search = GoogleSearch.new(q: "coffee", tbm: "isch")
image_results_list = search.get_hash[:images_results]
image_results_list.each do |image_result|
  puts image_result[:original]
  # to download the image: `wget #{image_result[:original]}`
end

This code prints all the image links, and downloads each image if you uncomment the wget line (a Linux/macOS tool for downloading files).

Search Google News

search = GoogleSearch.new({
  q: "coffee",  # search query
  tbm: "nws",   # news vertical
  tbs: "qdr:d", # last 24h
  num: 10       # 10 results per page
})

3.times do |offset|
  search.params[:start] = offset * 10
  news_results_list = search.get_hash[:news_results]
  news_results_list.each do |news_result|
    puts "#{news_result[:position] + offset * 10} - #{news_result[:title]}"
  end
end

This script prints the titles from the first 3 pages of news for the last 24 hours.
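The offset arithmetic used above generalizes to any page size. A tiny hypothetical helper (not part of the gem) makes the page-to-offset mapping explicit:

```ruby
# zero-based page index -> value for the :start parameter,
# given :num results per page
def start_offset(page, per_page = 10)
  page * per_page
end

# offsets for the first 3 pages of 10 results each
offsets = 3.times.map { |page| start_offset(page) }
```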

Search Google Shopping

search = GoogleSearch.new({
  q: "coffee",    # search query
  tbm: "shop",    # shopping vertical
  tbs: "p_ord:rv" # sort by best review
})
shopping_results_list = search.get_hash[:shopping_results]
shopping_results_list.each do |shopping_result|
  puts "#{shopping_result[:position]} - #{shopping_result[:title]}"
end

This script prints all the shopping results ordered by review, with their positions.

Google Search By Location

With SerpApi, we can run a Google search from anywhere in the world. This code looks for the best coffee shop in each city.

["new york", "paris", "berlin"].each do |city|
  # get the location from the city name
  location = GoogleSearch.new(q: city, limit: 1).get_location.first[:canonical_name]

  # get the top result
  search = GoogleSearch.new({
    q: 'best coffee shop',
    location: location,
    num: 1,  # number of results
    start: 0 # offset
  })
  top_result = search.get_hash[:organic_results].first

  puts "top coffee result for #{location} is: #{top_result[:title]}"
end

Batch Asynchronous search

We offer two ways to speed up your searches thanks to the async parameter:

  • Non-blocking - async=true (recommended)
  • Blocking - async=false - more compute intensive, because the client must hold many concurrent connections.
company_list = %w(microsoft apple nvidia)

puts "submit batch of asynchronous searches"
search = GoogleSearch.new(async: true)

search_queue = Queue.new
company_list.each do |company|
  # set the query
  search.params[:q] = company

  # submit the request - non-blocking with async: true
  result = search.get_hash
  if result[:search_metadata][:status] =~ /Cached|Success/
    puts "#{company}: search done"
    next
  end

  # store the pending result in the search queue
  search_queue.push(result)
end

puts "wait until all searches are cached or successful"
search = GoogleSearch.new
until search_queue.empty?
  result = search_queue.pop
  # extract the search id
  search_id = result[:search_metadata][:id]

  # retrieve the search from the archive - blocking
  search_archived = search.get_search_archive(search_id)
  if search_archived[:search_metadata][:status] =~ /Cached|Success/
    puts "#{search_archived[:search_parameters][:q]}: search done"
    next
  end

  # still pending - requeue for another pass
  search_queue.push(result)
end

puts 'all searches completed'

This code shows a simple implementation for running a batch of asynchronous searches.
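Stripped of the networking, the submit-then-drain pattern above is just a Queue loop with requeueing. A self-contained sketch, with a stubbed status check standing in for the archive lookup (each fake search "completes" on its second poll):

```ruby
# count how many times each fake search has been polled
polls = Hash.new(0)
queue = Queue.new
%w(microsoft apple nvidia).each { |company| queue.push(company) }

done = []
until queue.empty?
  company = queue.pop
  polls[company] += 1
  if polls[company] >= 2   # stand-in for status =~ /Cached|Success/
    done << company        # search finished
  else
    queue.push(company)    # not ready yet - requeue and poll again later
  end
end
```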

Supported search engines

Google search API

GoogleSearch.api_key = ""
search = GoogleSearch.new(q: "Coffee", location: "Portland")
pp search.get_hash

Bing search API

BingSearch.api_key = ""
search = BingSearch.new(q: "Coffee", location: "Portland")
pp search.get_hash

Baidu search API

BaiduSearch.api_key = ""
search = BaiduSearch.new(q: "Coffee")
pp search.get_hash

Yahoo search API

YahooSearch.api_key = ""
search = YahooSearch.new(p: "Coffee")
pp search.get_hash

Yandex search API

YandexSearch.api_key = ""
search = YandexSearch.new(text: "Coffee")
pp search.get_hash

Ebay search API

EbaySearch.api_key = ""
search = EbaySearch.new(_nkw: "Coffee")
pp search.get_hash

Youtube search API

YoutubeSearch.api_key = ""
search = YoutubeSearch.new(search_query: "Coffee")
pp search.get_hash

Homedepot search API

HomedepotSearch.api_key = ""
search = HomedepotSearch.new(q: "Coffee")
pp search.get_hash

Walmart search API

WalmartSearch.api_key = ""
search = WalmartSearch.new(query: "Coffee")
pp search.get_hash

Duckduckgo search API

DuckduckgoSearch.api_key = ""
search = DuckduckgoSearch.new(q: "Coffee")
pp search.get_hash

Naver search API

search = NaverSearch.new(query: "Coffee", api_key: "secretApiKey")
pp search.get_hash

Apple store search API

search = AppleAppStoreSearch.new(term: "Coffee", api_key: "secretApiKey")
pp search.get_hash

Generic SerpApi search

SerpApiSearch.api_key = ENV['API_KEY']
query = {
  p: "Coffee",
  engine: "youtube"
}
search = SerpApiSearch.new(query)
hash = search.get_hash
pp hash[:organic_results]

see: google-search-results-ruby/test/search_api_spec.rb

Error management

This library follows Ruby's standard convention of raising an exception when something goes wrong. Any networking-related exception is passed through as is. Anything related to the client layer is raised as a SerpApiException. A SerpApiException might be caused by a bug in the library, whereas a networking problem will be caused by either the backend or your internet connection.
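A sketch of the resulting rescue pattern; SerpApiException is redefined locally and fake_search is a stand-in, so the snippet runs without the gem or a network:

```ruby
require 'socket'  # defines SocketError
require 'timeout' # defines Timeout::Error

# redefined here only to keep the sketch self-contained
class SerpApiException < StandardError; end

# stand-in for a search call that hits a client-layer problem
def fake_search
  raise SerpApiException, "missing api_key"
end

outcome =
  begin
    fake_search
  rescue SerpApiException => e
    "client error: #{e.message}"  # library / client-layer problem
  rescue SocketError, Timeout::Error => e
    "network error: #{e.message}" # networking problem, passed through as is
  end
```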

Change log

  • 2.2
    • add Apple App Store search engine
    • add Naver search engine
  • 2.1 - add more search engines: Youtube, Duckduckgo, Homedepot, Walmart
    • improve exception / HTTP status handling
    • improve error management and documentation
  • 2.0 - API simplified (GoogleSearchResults -> GoogleSearch), fix gem issue with Ruby 2.6+, out-of-the-box step to verify the package before delivery
  • 1.3.2 - rename variable client to search for naming consistency
  • 1.3 - support for all major search engines
  • 1.2 - stable version supporting Google and a few more search engines
  • 1.1 - client connection improvement to allow multi-threading and Fiber support
  • 1.0 - first stable version with Google search and Google Images support


SerpApi supports all the major search engines. Google has the most advanced support, with all the major services available: Images, News, Shopping and more. To enable a type of search, the field tbm (to be matched) must be set to:

  • isch: Google Images API.
  • nws: Google News API.
  • shop: Google Shopping API.
  • any other Google service should work out of the box.
  • (no tbm parameter): regular Google search.

The field tbs allows you to customize the search even more.
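Putting tbm and tbs together, for example to request news from the last 24 hours. The TBM mapping hash is illustrative; only the string values come from the list above:

```ruby
require 'uri'

# service name -> tbm value, from the list above
TBM = { images: "isch", news: "nws", shopping: "shop" }.freeze

# news vertical, restricted to the last 24 hours via tbs
params = { q: "coffee", tbm: TBM[:news], tbs: "qdr:d" }
query = URI.encode_www_form(params)
```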

The full documentation is available here.


Contributions are welcome, feel free to submit a pull request!

To run the tests:

export API_KEY="your api key"
rake test
