Send LinkedIn Data to Google BigQuery Using Node.js

To export data from LinkedIn to Google BigQuery using Node.js, you will work with both the LinkedIn API and the BigQuery API. The process breaks down into the following high-level steps:

  1. First, you’ll need to register as a developer on the LinkedIn API platform and obtain an access token. You can use this access token to authenticate your requests to the LinkedIn API and retrieve data from your LinkedIn account or a public LinkedIn account.
  2. Once you have the data you want to export from LinkedIn, you can use the BigQuery API to create a new dataset and table in your BigQuery project. You can then use the API to load the data from LinkedIn into the table.
  3. To use the LinkedIn and BigQuery APIs, you’ll need to install the necessary packages in your Node.js environment. For the LinkedIn API, you can use the linkedin-sdk package. For the BigQuery API, you can use the @google-cloud/bigquery package.
  4. You can use the Node.js request module or a similar package to make HTTP requests to the LinkedIn API and retrieve the data you want to export. You can then use the @google-cloud/bigquery package to authenticate your requests to the BigQuery API and load the data into your BigQuery table.
  5. Once you have the data in BigQuery, you can use SQL queries to analyze and manipulate the data as needed.

Here is an example of how you could use the linkedin-sdk and @google-cloud/bigquery packages to export data from LinkedIn to Google BigQuery in Node.js:

const fs = require('fs');
const os = require('os');
const path = require('path');
const LinkedIn = require('linkedin-sdk');
const {BigQuery} = require('@google-cloud/bigquery');

async function exportData() {
  // Replace these values with your own
  const clientId = 'your_client_id';
  const clientSecret = 'your_client_secret';
  const accessToken = 'your_access_token';
  const projectId = 'your_project_id';
  const datasetId = 'your_dataset_id';
  const tableId = 'your_table_id';

  // Authenticate to LinkedIn and retrieve basic profile fields.
  // LinkedIn changes its API often, so check the current documentation
  // for the exact endpoint, field selectors, and response field names.
  const linkedin = new LinkedIn(clientId, clientSecret);
  linkedin.setAccessToken(accessToken);
  const data = await linkedin.people.asMember('~:(id,first-name,last-name)');

  // Initialize the BigQuery client
  const bigquery = new BigQuery({projectId});

  // BigQuery load jobs read from a file, so write the retrieved records
  // out as newline-delimited JSON first. The row keys must match the
  // schema below; adjust the mapping to the fields your request returns.
  const rows = Array.isArray(data) ? data : [data];
  const tempFile = path.join(os.tmpdir(), 'linkedin_export.json');
  fs.writeFileSync(tempFile, rows.map(row => JSON.stringify(row)).join('\n'));

  // Load the file into a BigQuery table
  const options = {
    sourceFormat: 'NEWLINE_DELIMITED_JSON',
    schema: {
      fields: [
        {name: 'id', type: 'STRING'},
        {name: 'first_name', type: 'STRING'},
        {name: 'last_name', type: 'STRING'},
      ],
    },
    createDisposition: 'CREATE_IF_NEEDED',
    writeDisposition: 'WRITE_APPEND',
  };
  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .load(tempFile, options);

  console.log(`Job ${job.id} completed.`);
}

exportData().catch(console.error);

This code authenticates to LinkedIn using the linkedin-sdk package and retrieves data from the user’s profile. It then uses the @google-cloud/bigquery package to create a new table in a BigQuery dataset and load the data into the table.

Keep in mind that you’ll need to replace the placeholder values in the code with your own LinkedIn client ID, client secret, access token, and BigQuery project, dataset, and table IDs.

You’ll also need to ensure that you have the necessary packages installed and that you have set up authorization for the BigQuery API.
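
Once the load job completes, the analysis in step 5 is plain SQL. As a minimal sketch, here is how you could run a query from Node.js with the same @google-cloud/bigquery client; the project, dataset, and table names are the same placeholders used above, and the query itself is only an illustration:

const {BigQuery} = require('@google-cloud/bigquery');

async function queryLinkedInData() {
  const bigquery = new BigQuery({projectId: 'your_project_id'});

  // Example query: count loaded rows per last name.
  const query = `
    SELECT last_name, COUNT(*) AS profiles
    FROM \`your_project_id.your_dataset_id.your_table_id\`
    GROUP BY last_name
    ORDER BY profiles DESC`;

  const [rows] = await bigquery.query({query});
  rows.forEach(row => console.log(`${row.last_name}: ${row.profiles}`));
}

queryLinkedInData().catch(console.error);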

(Note: LinkedIn changes its API often, so the endpoints and field names above may need updating.)

Send Instagram Data to Google BigQuery Using Node.js

Are you eager to start sending Instagram data to Google BigQuery using Node.js but haven’t found the snippets of code needed to connect the dots?

First, you’ll need to register as a developer on the Instagram API platform and obtain an access token. You can use this access token to authenticate your requests to the Instagram API and retrieve data from your Instagram account or a public Instagram account.

Once you have the data you want to export from Instagram, you can use the BigQuery API to create a new dataset and table in your BigQuery project. You can then use the API to load the data from Instagram into the table.

To use the Instagram and BigQuery APIs, you’ll need to install the necessary packages in your Node.js environment. For the Instagram API, you can use the instagram-private-api package. For the BigQuery API, you can use the @google-cloud/bigquery package.

You can use the Node.js request module or a similar package to make HTTP requests to the Instagram API and retrieve the data you want to export. You can then use the @google-cloud/bigquery package to authenticate your requests to the BigQuery API and load the data into your BigQuery table.

Once you have the data in BigQuery, you can use SQL queries to analyze and manipulate the data as needed.

Here is an example of how you could use the instagram-private-api and @google-cloud/bigquery packages to export data from Instagram to Google BigQuery in Node.js:

const fs = require('fs');
const os = require('os');
const path = require('path');
const InstagramPrivateAPI = require('instagram-private-api');
const {BigQuery} = require('@google-cloud/bigquery');

async function exportData() {
  // Replace these values with your own
  const username = 'your_username';
  const password = 'your_password';
  const projectId = 'your_project_id';
  const datasetId = 'your_dataset_id';
  const tableId = 'your_table_id';

  // Authenticate to Instagram. The instagram-private-api interface has
  // changed between major versions, so adjust these calls to match the
  // version you have installed.
  const device = new InstagramPrivateAPI.Device(username);
  const storage = new InstagramPrivateAPI.CookieFileStorage(`${__dirname}/cookies/${username}.json`);
  const session = await InstagramPrivateAPI.Session.create(device, storage, username, password);

  // Use the Instagram API to page through the account's followers
  const feed = new InstagramPrivateAPI.Feed.AccountFollowers(session);
  const data = [];
  const page = feed.iterate();
  while (true) {
    const {value, done} = await page.next();
    if (done || !value) {
      break;
    }
    data.push(value);
  }

  // Initialize the BigQuery client
  const bigquery = new BigQuery({projectId});

  // BigQuery load jobs read from a file, so write the follower records out
  // as newline-delimited JSON. The row keys must match the schema below;
  // map the follower objects accordingly for your library version.
  const tempFile = path.join(os.tmpdir(), 'instagram_followers.json');
  fs.writeFileSync(tempFile, data.map(row => JSON.stringify(row)).join('\n'));

  // Load the file into a BigQuery table
  const options = {
    sourceFormat: 'NEWLINE_DELIMITED_JSON',
    schema: {
      fields: [
        {name: 'name', type: 'STRING'},
        {name: 'username', type: 'STRING'},
        {name: 'profile_picture', type: 'STRING'},
      ],
    },
    createDisposition: 'CREATE_IF_NEEDED',
    writeDisposition: 'WRITE_APPEND',
  };
  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .load(tempFile, options);

  console.log(`Job ${job.id} completed.`);
}

exportData().catch(console.error);

This code authenticates to Instagram using the instagram-private-api package and retrieves the user’s followers. It then uses the @google-cloud/bigquery package to create a new table in a BigQuery dataset and load the data into it.

Keep in mind that you’ll need to replace the placeholder values in the code with your own Instagram username, password, and BigQuery project, dataset, and table IDs. You’ll also need to ensure that you have the necessary packages installed and that you have set up authorization for the BigQuery API.
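
Setting up authorization for the BigQuery API usually comes down to pointing the client at service account credentials. Here is a minimal sketch; the key file path below is just an example, not something from the original post:

const {BigQuery} = require('@google-cloud/bigquery');

// Option 1: pass a service account key file explicitly.
const bigquery = new BigQuery({
  projectId: 'your_project_id',
  keyFilename: '/path/to/service-account-key.json',
});

// Option 2: rely on Application Default Credentials by exporting
// GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account-key.json
// and constructing the client with new BigQuery() and no options.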

Send Facebook Data to Google BigQuery Using Node.js

To transfer data from Facebook to Google BigQuery, you can use the Facebook Graph API to obtain the data and then utilize the Google Cloud API to load it into BigQuery. This is a general overview of the steps involved in this process:

  1. Create a Facebook developer account and obtain an access token that allows you to access the Facebook Graph API.
  2. Use the Facebook Graph API to retrieve the data you want to export. You can use the API’s /{object-id}/{connection-name} endpoint to retrieve data for a specific object, such as a user or a page, and its connections, such as posts or comments.
  3. Use the Google Cloud API to load the data into BigQuery. You can use the bq command-line tool or the BigQuery API to create a new table in BigQuery and load the data into it.

Here’s some example code using the request and @google-cloud/bigquery libraries in Node.js to retrieve data from the Facebook Graph API and load it into BigQuery:

const util = require('util');
const request = util.promisify(require('request'));
const {BigQuery} = require('@google-cloud/bigquery');

async function exportData() {
  // Retrieve data from the Facebook Graph API. Replace the {placeholders}
  // with your object ID, connection name, access token, and field list.
  const response = await request({
    url: 'https://graph.facebook.com/v8.0/{object-id}/{connection-name}',
    qs: {
      access_token: '{access-token}',
      fields: '{fields}',
      limit: 100
    },
    json: true
  });
  const rows = response.body.data || [];

  // Load the data into BigQuery. The client picks up Application Default
  // Credentials, so no explicit auth client is needed here.
  const bigquery = new BigQuery({projectId: '{project-id}'});

  const dataset = bigquery.dataset('{dataset-name}');
  const table = dataset.table('{table-name}');
  await table.insert(rows);
}

exportData().catch(console.error);

You’ll need to modify it to fit your specific use case.

For example, you may need to paginate through the results if you have more data than the API’s limit, and you’ll need to specify the correct object and connection names and fields for the data you want to retrieve.
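
For example, the Graph API returns a paging.next URL with each page of results. Here is a rough sketch of following those pages until the data runs out, using the same promisified request pattern as above (the helper name and starting URL are illustrative):

const util = require('util');
const request = util.promisify(require('request'));

// Follow Graph API cursor pagination by requesting each paging.next URL.
async function fetchAllPages(firstUrl) {
  const rows = [];
  let url = firstUrl;
  while (url) {
    const response = await request({url, json: true});
    rows.push(...(response.body.data || []));
    url = response.body.paging && response.body.paging.next;
  }
  return rows;
}

// Usage: pass the initial endpoint with access_token, fields, and limit
// already included as query parameters, then insert the returned rows
// into BigQuery as shown above.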

You can find more information in the official Facebook Graph API and BigQuery API documentation.

Solution: The ability to connect to Google Sheets greater than 10 MB is currently not built into the product.

When building a data source on Google Sheets in Tableau Desktop, 10 MB is the maximum size per Google Sheet. However, what if we could connect to more than one Google Sheet at the same time?

Google Sheets works wonders with Tableau Public because Tableau Public can read data from Google Sheets once per day. This gives everyone the ability to use an online cloud data source to update their dashboards.

Introduction

In this blog, we will discuss connecting to multiple Google Sheets in one connection. If simply trimming down your large sheets is not sufficient, and you’re willing to break your data apart into multiple sheets manually or with an engineer, you will find this article helpful. We break apart each element, explain how it works, and show how your engineer may begin building the solution.

Tableau currently has no built-in feature for this; however, there is a feature you can set up to make it automatically connect to multiple Google Sheets! Tableau suggests this isn’t possible and that the only way to make it work is to use less data, but what if you have big data?

We built this blog to help you begin the journey of connecting to many sheets. You will want to demo this as a possible solution to your engineering team, who can then automatically create these Google Sheets (we are here to help too).

Error explained

If you begin connecting to a Google Sheet larger than 10 MB in Tableau Desktop, you will see various popups, depending on your operating system, explaining that an error has occurred with Google Sheets.

Unable to complete action error message

Did you recently see this error?

Unable to complete action
The Google Sheets service reported an unrecognized error when processing this request.
This file is too large to be exported.

An error occurred while communicating with Google Sheets: the >10 MB Google Sheets error message in Tableau Desktop.

A good question to start asking: “will my data split into other sheets easily?”

In the example below, we are going to speak to an 80 MB Google Sheet that will not work in Tableau.

Tech Workaround explained

The Tableau Desktop wildcard (automatic) union feature will capture Google workbooks (which contain Google sheets) and the sheets within those workbooks. It “automates” building connections to multiple sub-10 MB workbooks or sheets by establishing a stack of data that resembles your end goal, similar to using UNION ALL in SQL.

Stone Working Analogy

If you’re unfamiliar with union in SQL, think of your data stacking on each other like bricks on a brick wall. Data engineers and brick wall experts have similar jobs. They are solving a problem by stacking side by side or on top of each other.

We will focus on a “workbooks” example, where we will stack bricks of data on each other.

Using a matching pattern (xxx*), we are able to union similarly named data sources.

Example: there are four regions in our data, about 80 MB in total, and each individual Google Sheet must stay under 10 MB.

Four regions:

  • SOUTH 10 MB
  • NORTH 10 MB
  • EAST 10 MB
  • WEST 50 MB*

A Use Case Explaining the Solution

Step 0: a union means the sheets need the same column headers. Thanks for learning about unions!

Step 1: build 8 Google Sheets workbooks… (new workbooks, not new sheets; this works with individual sheets too, but I’m using workbooks for now).

Step 2: name each Google Sheets workbook “Table-Demo_EXAMPLE”, etc., and you will have the following.

  • Table-Demo_SOUTH
  • Table-Demo_NORTH
  • Table-Demo_EAST
  • Table-Demo_WEST_1
  • Table-Demo_WEST_2
  • Table-Demo_WEST_3
  • Table-Demo_WEST_4
  • Table123-Demo_WEST_5

Pro tip: Table123-Demo_WEST_5 will not be included in this exercise because it is not named Table-Demo_. Wildcards give you the ability to filter to just what you need. If you name your Google Sheets “Table-Demo_…”, the wildcard solution automates the connection to those sheets; there’s no need to connect to the extra Google Sheet if you’re setting up the solution as explained.

Now that we have an understanding of how a wildcard will work, let’s discuss the end to end.

How to set up a >10 MB union

To work with more than 10 MB of Google Sheets data, and to get more overall Google Sheets insight in Tableau Desktop, you need to get good with the union wildcard!

Connect to the Google Sheet. Tableau Desktop makes this workflow a one-click button on the left side of the opening screen, requiring two clicks in total.

Walk through the Google authentication and choose the account that holds the similarly named tables for the wildcard. This means you need to go rename the tables you wish to put together.

The renaming step should ideally be part of an automated process (a sketch follows below). Using the Google Sheets API, we found success automatically creating new Google Sheets and naming them consistently, which improved a client engagement during a Tableau consulting project that required a lot of data engineering to generate the solution. If the data is constantly changing, you may also need to delete old sheets; we found that clearing a sheet and re-populating the data was the easiest method for a fresh refresh. For now, let’s focus on the manual process, because the architecture is similar. We found that naming tables differently between tests helped with testing and troubleshooting, and that Google Sheets had some strange hiccups that are easier to avoid by removing old tests completely.
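
As a rough sketch of that automation step, here is what creating a consistently named workbook and populating it can look like with the googleapis Node.js client. The package choice, the service account key file, and the helper name are our assumptions, not part of the original workflow:

const {google} = require('googleapis');

async function createDemoWorkbook(regionName, rows) {
  // Authenticate with a service account key file (path is an assumption).
  const auth = new google.auth.GoogleAuth({
    keyFile: 'service-account.json',
    scopes: ['https://www.googleapis.com/auth/spreadsheets'],
  });
  const sheets = google.sheets({version: 'v4', auth});

  // Create a new workbook named to match the Table-Demo_* wildcard pattern.
  const {data: spreadsheet} = await sheets.spreadsheets.create({
    requestBody: {properties: {title: `Table-Demo_${regionName}`}},
  });

  // Populate the workbook; clearing and re-writing the same range is the
  // simplest way to refresh the data on later runs.
  await sheets.spreadsheets.values.update({
    spreadsheetId: spreadsheet.spreadsheetId,
    range: 'Sheet1!A1',
    valueInputOption: 'RAW',
    requestBody: {values: rows},
  });

  return spreadsheet.spreadsheetId;
}

// Example: one workbook per region chunk, each kept under the 10 MB limit.
createDemoWorkbook('WEST_1', [['id', 'region', 'sales'], [1, 'WEST', 100]])
  .catch(console.error);

Note that a workbook created by a service account will likely need to be shared with the Google account Tableau uses to connect, or it will not appear in the connection dialog.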

Discussing Data Structure tips for >10mb Google Sheets

Here’s a good time to make sure the column headers are the same across sheets. If they are not, each mismatched header will create an additional column in the union, which will lead you down the path of learning how to solve for dynamic parameters because the string values are many-to-many.

Convert to union…

Very important step: open the drop-down caret and click Convert to Union.

This workaround lets you connect once to all similarly named sheets (using wildcards) instead of connecting to each Google Sheets workbook separately, so you can replace many data sources with a single data source.

The Wildcard Union Screen in Tableau

Tableau offers a feature to union more than one Google Sheet together, which enables users to quickly build large cloud data sources on Tableau Public or internally.

Example: Table-Demo_* will find anything with Table-Demo_ as the start of the sheet name.

Helpful related resources and documentation.

Below are documents, notes, and community posts from other Tableau authors.

  • https://community.tableau.com/s/question/0D54T00000C62YC/is-it-possible-to-union-google-sheets-from-different-workbooksconnections?t=1634788554971
  • https://community.tableau.com/s/question/0D54T00000C6d1DSAR/this-file-is-too-large-to-be-exported?_ga=2.151469113.1478915185.1634739545-1826838528.1627942705
  • https://community.tableau.com/s/question/0D54T00000C6gnP/google-spreadsheet-file-is-too-large-to-be-exported-error
  • https://community.tableau.com/s/question/0D54T00000WV6dySAD/tableau-couldnt-connect-to-google-sheet-data
  • https://community.tableau.com/s/question/0D54T00000G36TK/limits-to-know-for-tableau-public-google-sheets
  • https://community.tableau.com/s/question/0D54T00000CWeW0SAL/error-this-file-is-too-large-to-be-exported-when-extracting-data-from-google-sheet

The 10 MB limit with Google Sheets is ambiguous when you test it against actual CSV file sizes, so it’s better to determine a way of “stopping” the data before it gets too big.

Some interesting things to think through: we found 7 MB, 10.3 MB, 12.9 MB, and 19.1 MB CSV files coming from single Google Sheet connections with no popup error stopping Tableau Desktop from connecting to the data. Don’t treat CSV size as your break test.

Screenshot demonstrating various CSV files downloaded from Google Sheets – Tested Oct 28, 2021

Good to note: this is the size of the CSV when downloaded via the Google Sheets/Data/. Your team may get hung up on this detail; we found it’s better to focus on a row count if you’re not using full-stack engineering to complete the task.
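
If you want to automate that row-count check, here is a small sketch along the same lines; the spreadsheet ID, sheet name, and row threshold are all assumptions to adjust for your own data:

const {google} = require('googleapis');

// Count populated rows in column A so data can be split into a new
// Table-Demo_* workbook before any single sheet grows too large.
async function needsNewWorkbook(spreadsheetId, rowLimit) {
  const auth = new google.auth.GoogleAuth({
    keyFile: 'service-account.json',
    scopes: ['https://www.googleapis.com/auth/spreadsheets.readonly'],
  });
  const sheets = google.sheets({version: 'v4', auth});

  const {data} = await sheets.spreadsheets.values.get({
    spreadsheetId,
    range: 'Sheet1!A:A',
  });
  const rowCount = data.values ? data.values.length : 0;
  return rowCount >= rowLimit;
}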

Thank you for reading. If you are interested in implementing a process that uses the Google API, contact us to learn more.

Researched & Written by, Tyler Garrett founder of Dev3lop | Consulting Services.

4 Steps – How to Embed Google Data Studio in iFrame

If you need to embed Google Data Studio reports in an iFrame on your website, we have that solution here with screenshots below.


How to embed Google Data Studio: the steps.

  1. Click File
  2. Click Embed report
  3. Click ‘Copy to clipboard.’
  4. That’s it. Celebrate. You’re done. It’s free and there’s no monthly fees or subscription.
  5. Screenshots below!

The embed google data studio code – iframe

The code for embedding google data studio in an iframe is very easy.
<center><iframe style="border: 0;" src="http://XXXXXXX" width="500" height="900" frameborder="0" allowfullscreen="allowfullscreen"></iframe></center>

Google Data Studio Embed is packed with mobility.

Everyone is focused on recurring revenue and has missed the functionality most users request: easy embeds and the free ability to share across an organization without a hefty pricing punch.

Google Data Studio has a lot of sharp swords at its disposal; for now let’s focus on mobility.

Squeeze down the browser to see how responsive everything is without any programming or clicks.

Google is helping the world take a huge step in the right direction. Currently, you have to build multiple iterations in Tableau Desktop, or your end users are stuck with a static mold or automatic sizing that doesn’t work for all devices, because most users set large font sizes on their computers without even understanding what DPI settings are.

Hey, we only built this to offer a free solution because companies are trying to earn revenue off of this easy to do feature. Let us empower you to do this and you can save your bucks for another day.

Feel free to poke around after you finish embedding your google data studio report.

Embedding Google Data Studio Screenshots

Let us know if you need help. Advice is free! For solutions please see our business intelligence page!

Embed google data studio in two clicks, step 1

Embed google data studio clickthrough. It’s only two clicks away!

Embed google data studio in two clicks, step 2

Step 2 copy to clipboard! Embedding google data studio is too easy, thanks Google.

Submit Yoast Sitemap to Google

Do you want robots figuring out what you want to rank for? No, you don’t! At Dev3lop, we tested building a website without an XML sitemap submitted to Google Webmaster Tools. It was laughable, taught us a lot, and really built a great use case for using a sitemap.

This wasn’t an entirely huge waste for us actually. The robot showed us a lot of clever word phrases we didn’t think of and expanded our digital marketing touch.

Did Google crawl a Dev3lop website without an XML sitemap?

Yes. If you’re getting a lot of traffic, Google will come and check out your website! At the time we were getting about 2,000 page views per week, and when we started checking our queries in Search Console, we noticed the robot ranked us sporadically across odd terms and phrases. My favorite was ‘father-in-law backup google drive.’

We did this to validate the necessity of the XML sitemap, rather than just doing it because some guy named Niel wrote a 10,000-word blog saying XML sitemap over and over.

Rather, we decided to build data around this strategy and then optimize our SEO around it.

It’s a lengthy process and the pay isn’t great unless you rank for something promising, but if you’re savvy with data, it’s worth the time.

Why waste time to test XML sitemap validity when it’s a gold standard?

Do you jump off bridges because everyone else does?

After reading SEO blogs for a year straight and sacrificing 20-30 hours a week to study and practice, I noticed one overwhelming trait: none of these big names have any data background or any technical experience other than being the person who blogged about it.

It was hard to imagine everyone was jumping on the XML sitemap bandwagon without fully explaining the impacts of not doing it. I’m sure everyone has heard the needle in the haystack comment and how you won’t get indexed if you don’t have the sitemap.

A lot of that rambling is paid-for writing built for SEO-style blogs. Rather than explaining the good, the bad, and the limitations, they just dump content on top of content about things that everyone is already saying.

The short and simple is, if you don’t submit your XML sitemap to Google, you’re a needle in a haystack and more than likely not optimized onsite for ranking.

If you’ve not submitted your sitemap yet, this article is perfect for you.

Why should I submit my sitemap to Google?

Submit your sitemap to Google, or else! Or else you’re letting robots read your site, and they don’t do it that well, and they will likely rank you for nonsense.

When you want organic traffic, you need to consider jumping into a few different aspects of web technology. If you already have a website hosted, built and just need to get your sitemap over to google, congratulations! You’re about to take off into SEO space!

Are you ready to submit your XML sitemap to google?

Here’s how to get your XML sitemap open in Yoast, and then accepted by Google for ranking if your site is setup correctly!

  1. Host a website.
  2. Build a website, we use WordPress.
  3. Download an SEO app, we use Yoast.

    Submit Yoast Sitemap to Google Tutorial

    When you Submit Yoast Sitemap to Google, expect an increase in traffic!

  4. Turn Yoast advanced options on to ensure XML Sitemap functionality is visible in your WordPress side menu.
  5. Open XML Sitemap in the WordPress side panel, under SEO; if you don’t see it, repeat step 4.

    Submit Yoast Sitemap to Google Tutorial

    Submit Yoast Sitemap to Google or let a robot read your website, maybe.

  6. Enable the XML sitemap functionality in Yoast (missing this step? see the screenshot below).

    Submit Yoast Sitemap to Google Tutorial

    Click XML Sitemap before you submit your Yoast Sitemap to Google.

  7. Click XML Sitemap and copy your URL. Be sure to check it before asking Google to look at it. If it’s not there when you click it and you get a 404 message, that is okay and normal. If you can click on XML Sitemap but you’re not seeing the correct information, it’s common to need the fix in the next step.
  8. In your sidebar, open Settings, then Permalinks, scroll to the bottom, and just click Save Changes. That should do it.

    Submit Yoast Sitemap to Google Tutorial save changes

    WordPress and Yoast work flawlessly once you click Save Changes. Just navigate to permalinks and click save changes!

     

  9. Submit your sitemap by pasting it into Google Webmaster Console aka Search Console.

 

Time to Submit Yoast Sitemap to Google, right?!

Start getting ranked today, right?!

If you have content on your website and have actively been blogging about material that adds value to your customer, it’s a good thing to consider asking Google to come look at your website.

Before you Submit Yoast Sitemap to Google

Before you submit your XML sitemap, it’s really important to understand something about the content on your website. You won’t get ranked with what you currently have on your website, sorry to say, especially if you don’t have a single blog post.

Text headers are important, images are important, and how you optimize them together is critical. If these are great, you’re about 5% of the way there in a competitive market. We highly recommend testing your SEO strategies in a keyword-phrase marketplace that is not competitive.

For example, trying to rank for “SEO” is not possible because everyone is 10+ years ahead of you!

We haven’t been blogging?! Whoops.

If you’ve put off adding blogs or content, now is the time to leave the ego at the door and start sharing what you do with your customers. Otherwise, your authority on the web is not going to be noticeable to any experienced technology professional.

Google has an appetite for lots of content, not minimal simple pages, and with content comes ranking and authority! Although it may look cool, it’s not going to increase your organic traffic.

Stop that minimal stuff, start blogging, get over that fear.

Lastly, regarding: Submit Yoast Sitemap to Google

Regrettably, if you’ve ignored your family, friends, or business partner and never considered lifting a finger to blog, then your website won’t rank very well. You will not be able to expand to a global scale beyond your direct shares, social media outreach, or emailing people for business.

Blogging turns your website into a responsive business, and Google is simply offering you the ability to be strategic.

Time to smell the SEO roses, and get ranked like a true expert or use our SEO Experts.