Introduction

Local SEO is crucial for businesses looking to improve their visibility and conversions in specific locations. In this blog post, we will delve into the concept of the local SERP, the importance of local SEO, and the use of the UULE parameter for accurate local search results. We will also explore an example grid-search script that uses the GPS coordinates of a location for better search result precision. Our example search term will be "Cafe", and we will gather only the results on the first page of each search. You may scroll to the bottom of the page to get the full code.

What is Local SERP?

A Local Search Engine Results Page (SERP) refers to the search results displayed by search engines when a user conducts a search query with local intent. These results are tailored to the user's geographical location and often include local business listings, maps, and reviews. Local SERPs help users find relevant, nearby businesses or services while enabling businesses to reach potential customers effectively.

(Image: an example of local results in Google Search)

What is meant by Local SEO?

Local SEO, or Local Search Engine Optimization, is the practice of optimizing a website, its content, and its online presence to improve visibility in local search results. Local SEO targets potential customers in a specific geographical area by focusing on local search signals such as business listings, reviews, and citations. The goal of local SEO is to increase organic traffic, conversions, and online visibility for businesses with a physical location or service area. SerpApi offers a variety of Local SEO tools including but not limited to Google Local Results API, Google Local Pack Results API, Google Maps API, Google Local Services API, etc.

What are the benefits of Local SERP?

Local SERPs offer numerous benefits for businesses, including:

  1. Improved Visibility: Local SERPs provide targeted exposure for businesses by displaying their information to users in the relevant geographical area.

  2. Increased Conversions: By targeting users with location-specific intent, local SERPs can drive higher conversion rates.

  3. Enhanced Reputation: Local search results often include reviews and ratings, which can help businesses build trust and credibility with potential customers.

  4. Competitive Edge: Optimizing for local SERPs can give businesses a competitive advantage over competitors who neglect local SEO.

  5. Better User Experience: Local SERPs provide users with relevant, location-specific information, resulting in a more satisfying search experience.

What is the definition of the word "uule"?

UULE (Universal URL Encoded Location) is a URL parameter utilized by Google to pinpoint a user's specific location during a search. This parameter is encoded with a geolocation identifier, known as a canonical name, which represents a particular geographical area, or by GPS coordinates. By incorporating the UULE parameter in search queries, businesses can obtain precise local search results tailored to their target audience's location.

There are two different UULE parameter versions:

Version 1: w+CAIQICI…

This version uses a "canonical name," typically used by Google AdWords to geotarget ads. An example of a canonical name is "West New York, New Jersey, United States," which includes the name of the place, region, and country. Decoding this version requires handling the entire string as a URI component, decoding it, splitting the string by the "plus," and base64-decoding the second part of the string (starting with "CAIQICI").
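To make this concrete, here is a minimal Ruby sketch of the Version 1 encoding as it is commonly reverse-engineered in the SEO community. The protobuf field values in the header are assumptions inferred from decoded samples, not an official Google specification:

```ruby
require 'base64'

# Encode a canonical name into a Version 1 UULE string ("w+CAIQICI...").
# The payload is a tiny protobuf message: field 1 = 2, field 2 = 32,
# field 4 = the length-prefixed canonical name. These field values are
# assumptions based on decoded samples.
def encode_uule_v1(canonical_name)
  payload = "\x08\x02\x10\x20\x22".b +            # protobuf header bytes
            [canonical_name.bytesize].pack('C') + # single-byte length (names < 128 bytes)
            canonical_name.b
  "w+" + Base64.urlsafe_encode64(payload).delete('=')
end

# Decode it back: strip "w+", base64-decode, then skip the six header bytes
def decode_uule_v1(uule)
  decoded = Base64.urlsafe_decode64(uule.delete_prefix('w+'))
  decoded.byteslice(6, decoded.bytesize - 6)
end
```

Running `encode_uule_v1("West New York,New Jersey,United States")` produces a string beginning with the familiar `w+CAIQICI` prefix, and `decode_uule_v1` recovers the canonical name.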

Version 2: a+cm9…

This version can be found in the cookie written by Google after obtaining the location via the Geolocation API on the website. This version requires no guessing of fields and uses text instead of protocol buffers. After URL-decoding the string and base64-decoding it, you will get ASCII text containing information such as role, producer, provenance, timestamp, latitude, longitude, and radius.
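Here is a hedged Ruby sketch that produces a Version 2 string of this shape from GPS coordinates. The `role`, `producer`, and `provenance` values are assumptions taken from decoded samples of Google's location cookie, not a documented format:

```ruby
require 'base64'

# Build a Version 2 UULE ("a+cm9...") from GPS coordinates.
# The plain-text body mirrors what decoded samples of Google's location
# cookie contain; the role/producer/provenance values are assumptions.
def encode_uule_v2(latitude, longitude, radius: -1)
  body = <<~TEXT
    role:1
    producer:12
    provenance:0
    timestamp:#{(Time.now.to_f * 1_000_000).to_i}
    latlng{
    latitude_e7:#{(latitude * 1e7).round}
    longitude_e7:#{(longitude * 1e7).round}
    }
    radius:#{radius}
  TEXT
  "a+" + Base64.urlsafe_encode64(body).delete('=')
end

uule = encode_uule_v2(47.048429, 9.498674)
```

Note that latitude and longitude are stored in E7 format (degrees multiplied by ten million), and a radius of -1 leaves the search radius unspecified.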

You can employ the uule parameter to get precise results with most of SerpApi's Google SERP engines that return real-time organic results. We have developed a Ruby gem called uule_converter to convert GPS coordinates into the Version 2 uule described above. The advantage of this version is that you don't need to know the canonical names of the locations you are gathering data from. You can then run a grid search with a checker for your local rankings and filter results by their SERP features, such as the address or GPS coordinates of a local result, or by metrics such as rating, number of reviews, etc.

(Image: an example search in SerpApi's Playground)

Example Serp Checker Code for a GPS Grid

Possible Integrations

The example code given here uses SerpApi's Google Search Results Ruby Gem. You may use this gem, or the method shown here, to generate UULE parameters for a grid search and then use them with SerpApi's integrations in different languages. You may also make real-time organic searches for mobile devices to expand your keyword research in a specified location and enrich your rank tracker with pagination.

Required Gems and Variables

require 'uule_converter'
require 'geocoder'
require 'json'
require 'fileutils'
require 'concurrent'
require 'rest-client'
require 'google_search_results'

We will need the above gems to construct our searcher.

# Set your API key
api_key = 'Your API Key'

We’ll also need the SerpApi API Key. You may register to claim free credits.

# Set the Country of the Place you want to search
address = "Liechtenstein"

Let's set the address to Liechtenstein, a small country in Europe. I chose it because of its size: it is surrounded by Austria and Switzerland, so later we will exclude results from those two countries and keep only the cafes in Liechtenstein.

# Set the Search Language
search_language = "en"

The search language has no effect on the ordering of results in this case. We will set it to English for better display.

# Set the preference for grid search
grid_search = false

I added a Boolean variable here in case you only want to filter previously collected results with the example code.

# Get geocoded address object
address_geocoded = Geocoder.search(address).first

# Define the country code based on the geocoded address
desired_country = address_geocoded.country_code

We will use the Geocoder gem to extract the relevant data from the address we have given. Distinguishing the country will be important later for filtering out neighboring countries and making the search more precise.

if grid_search
  # Define address's latitude and longitude boundaries
  bounds = address_geocoded.data['boundingbox'].map(&:to_f)
  south, north, west, east = bounds.map { |b| b.round(6) }

  # Grid step size (approximately 1 km)
  step_size = 0.009

Define the GPS boundaries of a location to create a grid layout; in our case, these are the farthest points of Liechtenstein in each cardinal direction. The step_size variable defines how far we step at each interval in order to cover the area. Each uule we generate will return the cafes within roughly a 1 km radius of the GPS coordinates it encodes.
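Before launching a run, it can be useful to estimate how many API requests the grid will generate. Here is a quick back-of-the-envelope sketch; the bounding-box numbers below are approximate values for Liechtenstein used only for illustration, not geocoder output:

```ruby
# Rough request-count estimate for a grid run.
# Bounding-box values are approximate and illustrative.
south, north = 47.048, 47.270
west, east   = 9.471, 9.636
step_size    = 0.009

# Range#step yields floor(span / step) + 1 points per axis
lat_steps = ((north - south) / step_size).floor + 1
lng_steps = ((east - west) / step_size).floor + 1
puts "Grid: #{lat_steps} x #{lng_steps} = #{lat_steps * lng_steps} requests"
```

Note that 0.009° of latitude is about 1 km everywhere, but 0.009° of longitude shrinks with latitude (roughly 0.68 km at Liechtenstein's latitude), so the grid is somewhat denser east-west than north-south.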

  # Create 'searches' directory if it doesn't exist
  FileUtils.mkdir_p('searches')

We will save each search inside the searches directory to be combined later. This way, long grid searches can be recovered in case your system is interrupted.

  # Create a thread pool
  thread_pool = Concurrent::FixedThreadPool.new(10)

  # Create a concurrent counter for request numbering
  request_counter = Concurrent::AtomicFixnum.new(0)

  # Grid search
  puts "Starting grid search..."
  (south..north).step(step_size).each do |latitude|
    (west..east).step(step_size).each do |longitude|
      thread_pool.post do
        uule_encoded = UuleConverter.encode(latitude, longitude)

        #gl is set to the country code based on the geocoded address
        gl_parameter = desired_country.downcase
        #hl=en, Language parameter fixed for English
        hl_parameter = search_language
        
        #no_cache=true to get live results

        # Here, I have added a domain for extra precision
        # It is optional for the most part.
        google_domain = "google.li"

        # SerpApi request
        search = GoogleSearch.new(engine: "google_local", google_domain: google_domain, q: "Cafe", gl: gl_parameter, hl: hl_parameter, uule: uule_encoded, no_cache: true, serp_api_key: api_key)
        hash_results = search.get_hash

        # You can get deeper results by following `serpapi_pagination` -> `next_page`.
        # Some results will overlap between neighboring grid points;
        # duplicates are filtered out later.

        # Save response to a JSON file
        request_number = request_counter.increment
        File.open("searches/response_#{request_number}.json", 'w') { |f| f.write(JSON.pretty_generate(hash_results)) }
      end
    end
  end

  # Shutdown the thread pool and wait for the tasks to complete
  thread_pool.shutdown
  thread_pool.wait_for_termination
end

We will iterate through each GPS coordinate, make a search using the google_search_results gem, and then save the response into a local file inside the searches folder.

One thing to note here is that I have used only the first-page results for demonstration purposes. Some searches have second and further pages; you may refactor this code to implement pagination as well.

Another detail here is that I have set the gl parameter to Liechtenstein's country code, li. This keeps the locale of the search consistent. For extra precision, I have also set the google_domain parameter to the country's domain, google.li.

Moreover, you can see that I have made the searches in parallel, which is a fast way to gather responses from SerpApi. There is an even faster approach as well: async searches.

You can find detailed instructions on how to use them in SerpApi's documentation. Although the examples there are in Python, they are easy to adapt to any language.

Filtering Results

# If you want to skip the grid search and filter results directly
if grid_search == false
  folder_path = "searches"
  file_pattern = File.join(folder_path, "*") # get all files in folder
  file_count = Dir.glob(file_pattern).length
  request_counter = file_count
else
  request_counter = request_counter.value
end

Let’s define the number of files we want to go through to filter results.

# Read searches from local storage
puts "Reading searches from local storage..."
cafes = []
collected_place_ids = []
(1..request_counter).each do |i|
  search_data = JSON.parse(File.read("searches/response_#{i}.json")) rescue nil
  next unless search_data

  if search_data['local_results']
    search_data['local_results'].each do |result|
      unless collected_place_ids.include?(result['place_id'])
        cafes.push(result)
        collected_place_ids.push(result['place_id'])
      end
    end
  end
end

The next step is to collect the unique local results. Since each search targets a circle roughly 1 km in radius, the grid points must overlap so that the areas in between are also covered. For that reason, the same places will appear in multiple searches, and we de-duplicate them by their place_id.
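The same de-duplication can also be expressed compactly with Array#uniq. A toy example with made-up place_ids:

```ruby
# Two overlapping searches sharing one result; de-duplicate by place_id.
# uniq keeps the first occurrence, preserving original order.
page_a = [{ 'place_id' => '1', 'title' => 'Cafe X' },
          { 'place_id' => '2', 'title' => 'Cafe Y' }]
page_b = [{ 'place_id' => '2', 'title' => 'Cafe Y' },
          { 'place_id' => '3', 'title' => 'Cafe Z' }]

unique_cafes = (page_a + page_b).uniq { |r| r['place_id'] }
```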

# Find the neighbouring countries
class CountryNeighbours
  def self.get_neighbours(country_name)
    country_name = country_name.split(" ").join("%20")
    response = RestClient.get("https://restcountries.com/v3.1/name/#{country_name}?fullText=true")
    parsed_response = JSON.parse(response.body)
    
    if parsed_response.empty?
      return "Could not find country: #{country_name}"
    end
    
    neighbours = parsed_response[0]['borders'] || [] # 'borders' is absent for countries with no land neighbours
    
    if neighbours.empty?
      return "#{parsed_response[0]['name']['official']} has no neighbors."
    end
    
    neighbour_names = []
    
    neighbours.each do |neighbour_code|
      neighbour_response = RestClient.get("https://restcountries.com/v3.1/alpha/#{neighbour_code}")
      parsed_neighbour_response = JSON.parse(neighbour_response.body)
      neighbour_names << parsed_neighbour_response[0]['name']['common']
    end
    
    puts "#{parsed_response[0]['name']['official']}'s neighbors: #{neighbour_names.join(", ")}"
    neighbour_names
  end
end

Let's define a class to find neighbouring countries.

# Create a thread pool
thread_pool = Concurrent::FixedThreadPool.new(10)

# Create an empty array to store filtered cafes
filtered_cafes = Concurrent::Array.new

# Neighbouring countries
neighbors = CountryNeighbours.get_neighbours(address_geocoded.data["address"]["country"])

# Filter cafes in neighbouring countries in their address
puts "Filtering cafes in neighbouring countries in their address..."
cafes.each do |cafe|
  thread_pool.post do
    if cafe['address']
      neighbouring_country = false
      neighbors.each do |country|
        neighbouring_country = true if cafe['address'].include?(", #{country}")
      end
      next if neighbouring_country

      filtered_cafes.push(cafe)
      puts "Added cafe: #{cafe['title']} at #{cafe['address']}"
      # You may use this part to further filter your data using GPS Coordinates
      #if cafe.key?('gps_coordinates') && cafe['gps_coordinates'].key?('latitude') && cafe['gps_coordinates'].key?('longitude')
      #  coordinates = [cafe['gps_coordinates']['latitude'], cafe['gps_coordinates']['longitude']]
      #  country = Geocoder.search(coordinates).first.country_code
      #  if country == desired_country
      #    filtered_cafes.push(cafe)
      #    puts "Added cafe: #{cafe['title']} at #{cafe['address']}"
      #  end
      #end
    end
  end
end

# Shutdown the thread pool and wait for the tasks to complete
thread_pool.shutdown
thread_pool.wait_for_termination

Since Liechtenstein is small and surrounded by Austria and Switzerland, we will get many results from those countries as well. We can filter them out using the address field. The script could be improved further by checking each place's GPS coordinates individually for its country, as in the commented-out section above.
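Here is the filtering rule in isolation, run on made-up sample rows. The neighbor list is hard-coded for the example; in the script it comes from the REST Countries lookup:

```ruby
# Reject cafes whose address ends in a neighbouring country (sample data).
neighbors = ['Austria', 'Switzerland']
cafes = [
  { 'title' => 'Cafe A', 'address' => 'Vaduz' },
  { 'title' => 'Cafe B', 'address' => 'Feldkirch, Austria' },
  { 'title' => 'Cafe C', 'address' => 'Buchs, Switzerland' }
]

local_cafes = cafes.reject do |cafe|
  neighbors.any? { |country| cafe['address'].include?(", #{country}") }
end
```

Only "Cafe A" survives the filter; the other two match a ", Country" suffix for a neighbouring country. Note that this relies on Google formatting cross-border addresses with the country name, which held for this dataset but is worth verifying for yours.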

# Reindex positions
filtered_cafes = filtered_cafes.each_with_index { |cafe, index|  cafe["position"] = index + 1 }

# Save filtered cafes to a JSON file
File.open('filtered_results.json', 'w') { |f| f.write(JSON.pretty_generate(filtered_cafes)) }

Finally, save the filtered results into a resulting JSON file.

Results

[
  {
    "position": 2,
    "title": "Georg Brot & Kaffee",
    "rating": 4.5,
    "reviews_original": "(86)",
    "reviews": 86,
    "price": "$$",
    "type": "Cafe",
    "address": "Eschen",
    "place_id": "15376372137427994719",
    "place_id_search": "https://serpapi.com/search.json?device=desktop&engine=google_local&gl=li&google_domain=google.li&hl=en&ludocid=15376372137427994719&q=Cafe&uule=a%2Bcm9sZToxCnByb2R1Y2VyOjEyCnByb3ZlbmFuY2U6MAp0aW1lc3RhbXA6MTY4MTIxNjI0MTIwNjExMgpsYXRsbmd7CmxhdGl0dWRlX2U3OjQ3MDQ4NDI5MApsb25naXR1ZGVfZTc6OTQ5ODY3NDAKfQpyYWRpdXM6LTEK",
    "lsig": "AB86z5UWdQN7B3xNWKgYU4wVHFey",
    "thumbnail": "https://serpapi.com/searches/643552f21e803f1d09648e64/images/f98bc385e043920a6d2147a9e986a289c30fda633fea7e587c4423723afebe06.jpeg",
    "gps_coordinates": {
      "latitude": 47.2124433,
      "longitude": 9.523155500000001
    },
    "service_options": {
      "dine_in": true,
      "takeaway": true
    }
  },
  ...
]

With a simple grid search, we can gather all the cafes that show up on the first page of Google Local Results across a small country. The results have detailed fields for many uses, such as making inferences about what ranks one place higher than another. Moreover, if you offer a local service that cafes can use, you can gather all the relevant data about the cafes around you. There are many use cases for local SEO, as mentioned above.

Thank you for reading. I hope this blog post gives you a good idea of how to effectively utilize Local SERP data.

Full Code

require 'uule_converter'
require 'geocoder'
require 'json'
require 'fileutils'
require 'concurrent'
require 'rest-client'
require 'google_search_results'

# Set your API key
api_key = 'Your API Key'

# Set the Country of the Place you want to search
address = "Liechtenstein"

# Set the Search Language
search_language = "en"

# Set the preference for grid search
grid_search = false

# Get geocoded address object
address_geocoded = Geocoder.search(address).first

# Define the country code based on the geocoded address
desired_country = address_geocoded.country_code

if grid_search
  # Define address's latitude and longitude boundaries
  bounds = address_geocoded.data['boundingbox'].map(&:to_f)
  south, north, west, east = bounds.map { |b| b.round(6) }

  # Grid step size (approximately 1 km)
  step_size = 0.009

  # Create 'searches' directory if it doesn't exist
  FileUtils.mkdir_p('searches')

  # Create a thread pool
  thread_pool = Concurrent::FixedThreadPool.new(10)

  # Create a concurrent counter for request numbering
  request_counter = Concurrent::AtomicFixnum.new(0)

  # Grid search
  puts "Starting grid search..."
  (south..north).step(step_size).each do |latitude|
    (west..east).step(step_size).each do |longitude|
      thread_pool.post do
        uule_encoded = UuleConverter.encode(latitude, longitude)

        #gl is set to the country code based on the geocoded address
        gl_parameter = desired_country.downcase
        #hl=en, Language parameter fixed for English
        hl_parameter = search_language
        
        #no_cache=true to get live results

        # Here, I have added a domain for extra precision
        # It is optional for the most part.
        google_domain = "google.li"

        # SerpApi request
        search = GoogleSearch.new(engine: "google_local", google_domain: google_domain, q: "Cafe", gl: gl_parameter, hl: hl_parameter, uule: uule_encoded, no_cache: true, serp_api_key: api_key)
        hash_results = search.get_hash

        # You can get deeper results by following `serpapi_pagination` -> `next_page`.
        # Some results will overlap between neighboring grid points;
        # duplicates are filtered out later.

        # Save response to a JSON file
        request_number = request_counter.increment
        File.open("searches/response_#{request_number}.json", 'w') { |f| f.write(JSON.pretty_generate(hash_results)) }
      end
    end
  end

  # Shutdown the thread pool and wait for the tasks to complete
  thread_pool.shutdown
  thread_pool.wait_for_termination
end

# Find the neighbouring countries
class CountryNeighbours
  def self.get_neighbours(country_name)
    country_name = country_name.split(" ").join("%20")
    response = RestClient.get("https://restcountries.com/v3.1/name/#{country_name}?fullText=true")
    parsed_response = JSON.parse(response.body)
    
    if parsed_response.empty?
      return "Could not find country: #{country_name}"
    end
    
    neighbours = parsed_response[0]['borders'] || [] # 'borders' is absent for countries with no land neighbours
    
    if neighbours.empty?
      return "#{parsed_response[0]['name']['official']} has no neighbors."
    end
    
    neighbour_names = []
    
    neighbours.each do |neighbour_code|
      neighbour_response = RestClient.get("https://restcountries.com/v3.1/alpha/#{neighbour_code}")
      parsed_neighbour_response = JSON.parse(neighbour_response.body)
      neighbour_names << parsed_neighbour_response[0]['name']['common']
    end
    
    puts "#{parsed_response[0]['name']['official']}'s neighbors: #{neighbour_names.join(", ")}"
    neighbour_names
  end
end

# If you want to skip the grid search and filter results directly
if grid_search == false
  folder_path = "searches"
  file_pattern = File.join(folder_path, "*") # get all files in folder
  file_count = Dir.glob(file_pattern).length
  request_counter = file_count
else
  request_counter = request_counter.value
end


# Read searches from local storage
puts "Reading searches from local storage..."
cafes = []
collected_place_ids = []
(1..request_counter).each do |i|
  search_data = JSON.parse(File.read("searches/response_#{i}.json")) rescue nil
  next unless search_data

  if search_data['local_results']
    search_data['local_results'].each do |result|
      unless collected_place_ids.include?(result['place_id'])
        cafes.push(result)
        collected_place_ids.push(result['place_id'])
      end
    end
  end
end

# Create a thread pool
thread_pool = Concurrent::FixedThreadPool.new(10)

# Create an empty array to store filtered cafes
filtered_cafes = Concurrent::Array.new

# Neighbouring countries
neighbors = CountryNeighbours.get_neighbours(address_geocoded.data["address"]["country"])

# Filter cafes in neighbouring countries in their address
puts "Filtering cafes in neighbouring countries in their address..."
cafes.each do |cafe|
  thread_pool.post do
    if cafe['address']
      neighbouring_country = false
      neighbors.each do |country|
        neighbouring_country = true if cafe['address'].include?(", #{country}")
      end
      next if neighbouring_country

      filtered_cafes.push(cafe)
      puts "Added cafe: #{cafe['title']} at #{cafe['address']}"
      # You may use this part to further filter your data using GPS Coordinates
      #if cafe.key?('gps_coordinates') && cafe['gps_coordinates'].key?('latitude') && cafe['gps_coordinates'].key?('longitude')
      #  coordinates = [cafe['gps_coordinates']['latitude'], cafe['gps_coordinates']['longitude']]
      #  country = Geocoder.search(coordinates).first.country_code
      #  if country == desired_country
      #    filtered_cafes.push(cafe)
      #    puts "Added cafe: #{cafe['title']} at #{cafe['address']}"
      #  end
      #end
    end
  end
end

# Shutdown the thread pool and wait for the tasks to complete
thread_pool.shutdown
thread_pool.wait_for_termination

# Reindex positions
filtered_cafes = filtered_cafes.each_with_index { |cafe, index|  cafe["position"] = index + 1 }

# Save filtered cafes to a JSON file
File.open('filtered_results.json', 'w') { |f| f.write(JSON.pretty_generate(filtered_cafes)) }