Intro

Tracking Google Search rankings for different keywords is a useful way to monitor the visibility of your business. If you know where your domain ranks in the search results for different keywords, you can use that information to improve your SEO. Checking this manually for a long list of keywords or search queries is time-consuming and inefficient.

Today we are going to learn how to write a Node.js script that gathers Google Search rankings for your domain across a collection of keywords, then outputs that data to a .csv file.

For this project we will use the new SerpApi for JavaScript/TypeScript Node module. We will also use a module called objects-to-csv to export our data to a .csv file, dotenv for handling our environment variables, and delay to give searches time to finish.

If you do not need an explanation, you can access the full code on GitHub.

Prerequisites

You will need to be somewhat familiar with basic JavaScript. You will have an easier time if you are familiar with ES6 syntax and features, as well as Node and npm.

You need to have the latest versions of Node and npm installed on your system. If you don't already have them installed, you can visit the following link for help with this process:

https://docs.npmjs.com/downloading-and-installing-node-js-and-npm

You need to have an IDE installed on your system that supports JavaScript and Node.js. I recommend VSCode or Sublime Text, but any IDE that supports Node.js will do.

You will also need to sign up for a free SerpApi account at https://serpapi.com/users/sign_up.

Preparation

First you will need to create a package.json file. Open a terminal window, create a directory for the project, and cd into it.

mkdir track-google-rankings
cd track-google-rankings

Create your package.json file:

npm init

npm will walk you through the process of creating a package.json file. After this we need to make a few changes to it. We will be using the ES Module system rather than CommonJS, so you need to add "type": "module" to your package.json:

{
  "name": "track-google-search-rankings-nodejs",
  "version": "1.0.0",
  "description": "Track Google Search Rankings with NodeJs and SerpApi",
  "type": "module",
  "main": "index.js",

We will also add a start script for convenience:

  "scripts": {
    "start": "node index.js",
    "test": "echo \"Error: no test specified\" && exit 1"
  },
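Once index.js exists, this will let you run the project with:

npm start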

Next we need to install our dependencies:

npm install serpapi dotenv delay objects-to-csv

If you haven't already signed up for a free SerpApi account, go ahead and do that now by visiting https://serpapi.com/users/sign_up and completing the signup process.

Once you have signed up, verified your email, and selected a plan, navigate to https://serpapi.com/manage-api-key. Click the clipboard icon to copy your API key.

Then create a new file in your project directory called ‘.env’ and add the following line:

SERPAPI_KEY="PASTE_YOUR_API_KEY_HERE"

Scraping Google Search Results with SerpApi

To get the data we need to check our rankings, we will use SerpApi. SerpApi is a web-scraping API that streamlines the process of scraping data from search engines. This can be done manually as well, but SerpApi provides several distinct advantages. Besides providing real-time data in a convenient JSON format, SerpApi takes care of proxies, CAPTCHA solving, and the other techniques necessary to keep searches from being blocked. You can click here for a comparison of scraping Google Search results manually vs. using SerpApi, or here for an overview of the techniques you can use to prevent getting blocked.

We're now ready to start coding. Create a file called 'index.js' and open it with your IDE.

At the top of index.js add the following import statements:

import { config, getJson, getJsonBySearchId } from "serpapi";
import * as dotenv from "dotenv";

Then we need to add the following lines to configure the serpapi and dotenv modules:

dotenv.config();
config.api_key = process.env.SERPAPI_KEY;
config.timeout = 60000;

We will also define a few constants. We need an array of keywords and a domain to use throughout the project. Feel free to change these to anything that suits your purposes. You may want to include some keywords that you expect your domain to rank highly for, as well as some that won’t produce a direct number one hit.

// the keyword combinations we will check our rankings for
const keywords = [
  "serpapi",
  "serp api",
  "Google Search API",
  "Google Search Results API",
  "Google Organic Results API",
  "google serp api",
  "serps api",
  "Google Search Engine API",
  "Google Local Pack API ",
  "Knowledge Graph API",
  "Google Results API",
  "Search Results API",
  "Google News Results API",
  "Google News API",
  "News Results API",
];


// the domain to check the rankings of
const domain = "https://serpapi.com";

First, we will query SerpApi for Google Search results, passing a keyword as the search query:

async function keywordSearch(keyword) {
  // The parameters we will include in the GET request
  const params = {
    q: keyword,
    location: "Austin, Texas, United States",
    google_domain: "google.com",
    gl: "us",
    hl: "en",
    engine: "google",
    num: 10,
    start: 0,
  };


  // here we call the API and wait for it to return the data
  const data = await getJson("google", params);
  return data;
}

You can change the localization parameters (location, google_domain, gl, and hl) to any valid values you like. You can also change num to any multiple of 10 up to 100 if you want exact rankings for domain/keyword pairings that might fall below the 10th result. You can learn more about the parameters used with the Google Search Results API here.
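For example, a hypothetical variation of these parameters that checks the top 100 results on Google UK might look like this (the location string is illustrative; use any value accepted by SerpApi's Locations API):

const params = {
  q: keyword,
  location: "London, England, United Kingdom",
  google_domain: "google.co.uk",
  gl: "uk",
  hl: "en",
  engine: "google",
  num: 100, // fetch the top 100 results in a single request
  start: 0,
};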

Now we will print out the results in the terminal, so we can get a look at the data that is returned.

const results = await keywordSearch("serpapi");
console.log(results);

Scroll down in your terminal window to where it says organic_results. This is the part we are interested in.
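It should look something like this (abbreviated here; the real response contains ten results and many more fields per result):

"organic_results": [
  {
    "position": 1,
    "title": "SerpApi: Google Search API",
    "link": "https://serpapi.com/",
    "displayed_link": "https://serpapi.com",
    ...
  },
  ...
],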

You can see in the above example that the first hit is a site with title "SerpApi: Google Search API" and displayed_link "https://serpapi.com". This means that for the keyword "serpapi" the domain "serpapi.com" has a rank of 1.

Checking a Ranking

To check the rank programmatically, we need to write some code that will loop through the results, and stop when it finds the domain we are looking for:

async function getRanking(data) {
  const keyword = data.search_parameters.q;
  const results = data.organic_results;
  let rank = 1;
  // loop through the organic results until we find one that matches the domain we are looking for
  while (
    rank < results.length &&
    !results[rank - 1].displayed_link?.includes(domain)
  ) {
    rank++;
  }
  // if even the last result we checked doesn't match, the domain wasn't found with this query
  if (!results[rank - 1].displayed_link?.includes(domain)) {
    rank = "N/A";
  }

  return { keyword, rank };
}

In the condition of the while loop we check whether the displayed_link contains our domain name. Depending on what you are looking for, you could also compare titles or check for an exact match.
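For instance, a stricter hypothetical check that only accepts an exact match on the result's link field might look like this:

// a sketch of a stricter match: accept only results whose link is our domain itself,
// or a page directly under it, instead of any displayed_link containing the domain
const matchesDomain = (result) =>
  result.link === domain || result.link?.startsWith(domain + "/");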

Multiple Keyword Combinations

If we just wanted to check one keyword, we wouldn't need to write a script. We already defined an array of keyword combinations at the top of the file; now we will add a function to loop through them and get the ranking for each one:

// iterate over the list of keywords and get the ranking for each one
async function getAllRankings(keywords) {
  const rankings = await Promise.all(
    keywords.map(async (keyword) => {
      const result = await keywordSearch(keyword);
      return getRanking(result);
    })
  );
  return rankings;
}

Each call to getRanking() returns a Promise. We use Promise.all to combine these into a single Promise that resolves only when all of the Promises it contains have resolved. Since getAllRankings() is an async function, it also returns a Promise, so we can chain .then() onto it, as seen below:

getAllRankings(keywords).then((rankings) => console.log(rankings));

Exporting to .CSV

Let's export the data we’ve collected to a useful format. For this we will use the ‘objects-to-csv’ module you installed earlier. So far, we’ve written our script to store the data as an array of objects. The reason for this is that the objects-to-csv module allows us to input an array of objects and get back CSV.

First add the import statement:

import ObjectsToCsv from "objects-to-csv";
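As a quick illustration of how the module works, objects-to-csv also provides a toString() method, so you can preview the CSV output without writing a file:

// a minimal sketch: each object becomes a row, and the object keys become the header
const preview = new ObjectsToCsv([
  { keyword: "serpapi", rank: 1 },
  { keyword: "example keyword", rank: "N/A" },
]);
console.log(await preview.toString());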

Then add the following function:

async function rankingsToCsv() {
  const rankings = await getAllRankings(keywords);
  await new ObjectsToCsv(rankings).toDisk("./test.csv");
}

If you run the function:

rankingsToCsv();

You should see a file called 'test.csv' appear in your project directory. If you open it up, it should look like this:

keyword,rank
serpapi,1
serp api,1
Google Search API,2
Google Search Results API,1
Google Organic Results API,1
google serp api,1
serps api,1
Google Search Engine API,2
Google Local Pack API,1
Knowledge Graph API,4
Google Results API,1
Search Results API,1
Google News Results API,1
Google News API,N/A
News Results API,4

Using async=true

For this example we have only used a very short list of keyword combinations to get rankings for. But what if you have 100 or more queries you want to check? It will take a very long time if we have to wait for each request to finish before starting the next. We can speed up the process by using SerpApi’s async=true parameter.

Add the following import statement at the top of your file:

import delay from "delay";

The code below should look familiar. It is basically the same process we use for a regular search query, only we have added async=true and we are only extracting the search_id from the response.

async function getSearchId(keyword) {
  const params = {
    q: keyword,
    location: "Austin, Texas, United States",
    google_domain: "google.com",
    gl: "us",
    hl: "en",
    engine: "google",
    num: 10,
    start: 0,
    async: true,
  };


  let data = await getJson("google", params);
  const { id } = data.search_metadata;
  return id;
}
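You can sanity-check this by logging the id for a single keyword (top-level await works here because we are using ES modules):

const id = await getSearchId("serpapi");
console.log(id); // prints the id SerpApi assigned to this search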

Below we loop through all of the keywords in our keywords array and get a search_id for each one by calling the function we wrote above. We use the delay() function from the delay module we imported to give the searches time to finish before we try to retrieve them.

async function getAllSearchIds(keywords) {
  // fire off all of the searches at once and collect the ids as they come back
  const search_ids = await Promise.all(
    keywords.map((keyword) => getSearchId(keyword))
  );
  // give SerpApi time to finish processing the searches before we retrieve them
  await delay(1000);
  return search_ids;
}

The serpapi node module gives us getJsonBySearchId(), which makes it very easy for us to query the Search Archive and retrieve our search results using a search_id.

async function retrieveSearch(id) {
  const data = await getJsonBySearchId(id);
  return data;
}
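One thing to be aware of: if a search has not finished by the time we retrieve it, search_metadata.status in the response will still read "Processing" rather than "Success". Here is a minimal defensive sketch, assuming those status values, that polls until the search is done:

// keep re-fetching the search until SerpApi reports it has finished processing
async function retrieveSearchWhenDone(id) {
  let data = await getJsonBySearchId(id);
  while (data.search_metadata.status === "Processing") {
    await delay(500);
    data = await getJsonBySearchId(id);
  }
  return data;
}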

Finally we add a function that will loop through all of the ids, and get the ranking of our domain for each one.

async function getAllRankingsBySearchId(ids) {
  const rankings = await Promise.all(
    ids.map(async (id) => {
      const result = await retrieveSearch(id);
      return getRanking(result);
    })
  );
  return rankings;
}

Now we can print out the rankings to make sure our code is working:

const ids = await getAllSearchIds(keywords);
getAllRankingsBySearchId(ids).then((rankings) => console.log(rankings));

And then we can define a function to output to .csv:

async function asyncRankingsToCsv() {
  const ids = await getAllSearchIds(keywords);
  const rankings = await getAllRankingsBySearchId(ids);
  await new ObjectsToCsv(rankings).toDisk("./test.csv");
}

Then delete the test.csv file we created earlier and call the function to make sure it works:

asyncRankingsToCsv();

If you did everything correctly, you should see test.csv in your project directory again.

CSV files are compatible with most spreadsheet applications, so you can now load the data into Google Sheets, Excel, or a similar tool.

Conclusion

We have seen how to scrape Google Organic Search Results and check the Google Search ranking of a domain for a custom list of keywords. You now have a template you can modify however you need: you might try adding a list of locations and iterating through them for each keyword, comparing multiple domains, or adding logic to combine keywords programmatically.
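As a rough sketch of the first idea, the loop below checks every keyword in several locations; it assumes keywordSearch() has been modified to accept a location argument and pass it through to the search parameters:

// a hypothetical extension: check rankings for each keyword in several locations
const locations = [
  "Austin, Texas, United States",
  "New York, New York, United States",
];

for (const location of locations) {
  const rankings = await Promise.all(
    keywords.map(async (keyword) => {
      const result = await keywordSearch(keyword, location);
      const ranking = await getRanking(result);
      return { ...ranking, location };
    })
  );
  console.log(rankings);
}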

I hope you found the tutorial informative and easy to follow. If you have any questions, feel free to contact me at ryan@serpapi.com.

Full code on GitHub
Google Search API Documentation
SerpApi Playground