To export data from LinkedIn to Google BigQuery using Node.js, you’ll combine two APIs: the LinkedIn API to pull the data and the BigQuery API to store it. At a high level, the process breaks down into the following steps:

  1. First, you’ll need to register as a developer on the LinkedIn API platform and obtain an access token. You can use this access token to authenticate your requests to the LinkedIn API and retrieve data from your LinkedIn account or a public LinkedIn account (a token-exchange sketch follows this list).
  2. Once you have the data you want to export from LinkedIn, you can use the BigQuery API to create a new dataset and table in your BigQuery project and then load the LinkedIn data into that table (see the dataset-and-table sketch below).
  3. To use the LinkedIn and BigQuery APIs, you’ll need to install the necessary packages in your Node.js environment. For the LinkedIn API, you can use the linkedin-sdk package; for the BigQuery API, you can use the @google-cloud/bigquery package.
  4. You can use the Node.js request module or a similar HTTP client to make requests to the LinkedIn API and retrieve the data you want to export, and then use the @google-cloud/bigquery package to authenticate your requests to the BigQuery API and load the data into your BigQuery table (the token-exchange sketch below also shows a raw HTTP call).
  5. Once you have the data in BigQuery, you can use SQL queries to analyze and manipulate the data as needed (see the query sketch below).
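
For steps 1 and 4, here is a minimal sketch of exchanging an OAuth 2.0 authorization code for an access token and then calling LinkedIn’s profile endpoint with it. It assumes Node 18+ (for the built-in fetch), that you have already completed LinkedIn’s authorization redirect and captured the code, and that your app has access to the /v2/me endpoint; the function names here are placeholders you would adapt to your own app.

// Minimal sketch: exchange an authorization code for an access token,
// then fetch profile data. Assumes Node 18+ (global fetch) and that
// authCode came from LinkedIn's OAuth redirect.
async function getLinkedInAccessToken(clientId, clientSecret, authCode, redirectUri) {
  const body = new URLSearchParams({
    grant_type: 'authorization_code',
    code: authCode,
    redirect_uri: redirectUri,
    client_id: clientId,
    client_secret: clientSecret,
  });

  const res = await fetch('https://www.linkedin.com/oauth/v2/accessToken', {
    method: 'POST',
    body,
  });
  const json = await res.json();
  return json.access_token;
}

async function getLinkedInProfile(accessToken) {
  // The /v2/me endpoint returns the authenticated member's profile;
  // the fields you receive depend on your app's permissions.
  const res = await fetch('https://api.linkedin.com/v2/me', {
    headers: {Authorization: `Bearer ${accessToken}`},
  });
  return res.json();
}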
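
For step 2, here is a sketch of creating the dataset and table up front with @google-cloud/bigquery. The schema matches the fields exported in the larger example further down; that example can also create the table for you during the load job via CREATE_IF_NEEDED.

const {BigQuery} = require('@google-cloud/bigquery');

async function createDatasetAndTable(projectId, datasetId, tableId) {
  const bigquery = new BigQuery({projectId});

  // Create the dataset (this call errors if the dataset already exists).
  const [dataset] = await bigquery.createDataset(datasetId);

  // Create a table whose schema matches the LinkedIn fields being exported.
  const [table] = await dataset.createTable(tableId, {
    schema: 'id:string,first_name:string,last_name:string',
  });

  console.log(`Created ${dataset.id}.${table.id}`);
}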
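
And for step 5, once rows have landed in BigQuery you can run SQL against them directly from Node.js; this sketch assumes the same project, dataset, and table IDs as the example below.

const {BigQuery} = require('@google-cloud/bigquery');

async function queryExportedData(projectId, datasetId, tableId) {
  const bigquery = new BigQuery({projectId});

  // Run a simple SQL query against the loaded LinkedIn data.
  const [rows] = await bigquery.query({
    query: `SELECT id, first_name, last_name
            FROM \`${projectId}.${datasetId}.${tableId}\`
            LIMIT 10`,
  });

  rows.forEach((row) => console.log(row));
}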

Here is an example of how you could use the linkedin-sdk and @google-cloud/bigquery packages to export data from LinkedIn to Google BigQuery in Node.js:

const fs = require('fs');
const os = require('os');
const path = require('path');
const LinkedIn = require('linkedin-sdk');
const {BigQuery} = require('@google-cloud/bigquery');

async function exportData() {
  // Replace these values with your own
  const clientId = 'your_client_id';
  const clientSecret = 'your_client_secret';
  const accessToken = 'your_access_token';
  const projectId = 'your_project_id';
  const datasetId = 'your_dataset_id';
  const tableId = 'your_table_id';

  // Authenticate to LinkedIn and retrieve profile data.
  // Note: the linkedin-sdk interface shown here follows the original example;
  // the exact method names and response shape depend on the SDK and API version.
  const linkedin = new LinkedIn(clientId, clientSecret);
  linkedin.setAccessToken(accessToken);
  const profile = await linkedin.people.asMember('~:(id,first-name,last-name)');

  // Map the LinkedIn response to a flat row that matches the BigQuery schema.
  const row = {
    id: profile.id,
    first_name: profile.firstName || profile['first-name'],
    last_name: profile.lastName || profile['last-name'],
  };

  // BigQuery load jobs read from a file, so write the row(s) as
  // newline-delimited JSON to a temporary file first.
  const tmpFile = path.join(os.tmpdir(), 'linkedin_export.json');
  fs.writeFileSync(tmpFile, JSON.stringify(row) + '\n');

  // Initialize the BigQuery client
  const bigquery = new BigQuery({projectId});

  // Load the data into the BigQuery table, creating it if it does not exist.
  const options = {
    sourceFormat: 'NEWLINE_DELIMITED_JSON',
    schema: 'id:string,first_name:string,last_name:string',
    createDisposition: 'CREATE_IF_NEEDED',
    writeDisposition: 'WRITE_APPEND',
  };
  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .load(tmpFile, options);

  console.log(`Job ${job.id} completed.`);
}

exportData().catch(console.error);

This code authenticates to LinkedIn using the linkedin-sdk package and retrieves data from the user’s profile. It then writes that data to a temporary newline-delimited JSON file and uses the @google-cloud/bigquery package to load the file into a BigQuery table, creating the table if it does not already exist.

Keep in mind that you’ll need to replace the placeholder values in the code with your own LinkedIn client ID, client secret, access token, and BigQuery project, dataset, and table IDs.

You’ll also need to ensure that you have the necessary packages installed and that you have set up authorization for the BigQuery API.
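
For example, one common way to authorize the BigQuery client is with a service-account key file, either through Application Default Credentials or the keyFilename option; the project ID and key path below are placeholders.

const {BigQuery} = require('@google-cloud/bigquery');

// Option 1: rely on Application Default Credentials, e.g. by setting
// GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json before running.
const bigqueryDefault = new BigQuery({projectId: 'your_project_id'});

// Option 2: point the client at a service-account key file explicitly.
const bigqueryExplicit = new BigQuery({
  projectId: 'your_project_id',
  keyFilename: '/path/to/service-account.json',
});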

(Note: LinkedIn changes its API often, so the endpoints and SDK methods shown here may need to be updated.)
