Competitor Social Monitoring Using Social Media Scrapers

Staying ahead of the competition takes more than delivering a great product or service; it also requires understanding how your competitors are perceived in the market. Keeping tabs on them means continuously monitoring social media and any other platforms where they have a presence. By monitoring your competitors’ reviews, you can gain a deeper understanding of their strengths and weaknesses, discover potential growth areas for your product or business, and generate leads.

In this tutorial, we’ll learn how to scrape competitor reviews from social media platforms using The Social Proxy’s Social Scraper API. We’ll then use reverse lookup and professional contacts APIs to enrich the data you collect, providing deeper reviewer insights that you can leverage in your sales and lead generation strategy.

Why is monitoring competitor reviews important?

Competitor review monitoring is the process of tracking, gathering, and analyzing the reviews customers leave about your competitors’ products or services. It involves collecting data from review platforms such as Facebook, LinkedIn, Instagram, Google Reviews, or industry-specific sites to gain insight into your competitors’ market performance.

Monitoring reviews can help you identify patterns in customer feedback, understand your competitors’ strengths and weaknesses, and gauge overall market sentiment. This information helps you fill gaps in the market, improve your value proposition, and position your business more effectively.

A thorough competitor analysis puts sales and marketing teams in a strong position to generate and convert as many leads as possible. Identifying an unsatisfied customer of a competitor’s product creates a window of opportunity to send a personalized ad that addresses that customer’s specific pain point. Even so, getting your competitors’ reviews is easier said than done and requires the right tools. In the next sections, we’ll look at the tools for review monitoring.

Introduction to the social scraper, reverse lookup, and professional contacts APIs for review monitoring

Social Scraper API
Scraping reviews from social media platforms like Instagram or TikTok can be a real hassle without the right tools because these platforms have measures in place to prevent scraping. Fortunately, The Social Proxy’s social scraper provides various endpoints for retrieving specific, valuable data from these platforms.

Using the social scraper API enables you to retrieve data from these platforms without issues. Instead of manually setting up proxies to prevent detection and avoid IP blocks, you can focus on getting your data while the social scraper API handles everything on the backend.

The Social Proxy also offers mobile and residential proxies that you can integrate into your code to scrape data from platforms like Trustpilot, G2, or Capterra.
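As a minimal sketch, here’s how you might route a request through a residential proxy using the same request package used later in this tutorial. The proxy URL, credentials, and target page are placeholders; substitute the values from your dashboard:

const request = require('request');

// Placeholder proxy credentials -- replace with the values from your
// The Social Proxy dashboard.
const proxyUrl = 'http://USERNAME:PASSWORD@proxy.example.com:8080';

request(
  {
    method: 'GET',
    url: 'https://www.trustpilot.com/review/woocommerce.com', // example target page
    proxy: proxyUrl, // route the request through the residential proxy
  },
  function (error, response, body) {
    if (error) return console.error('Request failed:', error);
    console.log('Status:', response.statusCode);
    // Parse the returned HTML for reviews here.
  }
);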

Once you’ve gathered reviews about competitors, you’ll want more information about the reviewers so that your sales team can send them personalized ads. This is where reverse lookup and professional contacts APIs come into play. Let’s take a closer look at both of them.

Reverse lookup APIs retrieve detailed information about a user based on their email address, phone number, or IP address. For example, suppose a user leaves an unsatisfactory review about your competitor’s product. You can use a reverse lookup API to gather data about that user, such as their online presence, location, and potential influence. This information helps your team understand the user’s background and tailor a competitive strategy toward them.

Professional contacts APIs provide access to a wealth of professional information about a reviewer, such as their job title, company, industry, and LinkedIn profile. You can identify potential leads, partners, or threats by connecting the dots between a review and the reviewer’s professional background. For instance, if a reviewer is a key decision-maker at a company within your target market, this data can be used to create personalized marketing strategies or sales pitches.
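As a rough sketch of how such a lookup might fit into the code used in this tutorial: the endpoint path and response fields below are illustrative assumptions, not documented API details, so check your provider’s documentation for the real ones.

const request = require('request');

// NOTE: the endpoint path and response shape below are assumptions for
// illustration only -- consult the provider's documentation.
function professionalLookup(name, callback) {
  const options = {
    method: 'POST',
    url: 'https://thesocialproxy.com/wp-json/tsp/professional-contacts/v1/search?consumer_key={CONSUMER_KEY}&consumer_secret={CONSUMER_SECRET}',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ typed_query: name }),
  };

  request(options, function (error, response) {
    if (error) return callback(error);
    try {
      // Expected to contain fields such as job title, company, and LinkedIn URL.
      callback(null, JSON.parse(response.body));
    } catch (parseError) {
      callback(parseError);
    }
  });
}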

In the next sections, we’ll follow a step-by-step guide to scraping reviews about competitors on social media using the Social Scraper API. Then, we’ll use reverse lookup and professional contacts APIs to get more information about the reviewers and see how that data can be used in your sales strategy.

Step-by-step guide to scraping competitor reviews on social media using The Social Proxy’s Social Scraper API

In this section, you’ll use The Social Proxy’s Social Scraper API to gather social media reviews about a competitor’s product.

Step 1: Set up an account with The Social Proxy

  • Visit The Social Proxy’s official website.
  • Click “Login” if you already have an account. To create a new account, click “Get Started” and follow the next steps.
  • Fill out the required fields in the signup form and click “Sign Up.”

  • Click the account verification link sent to your email by The Social Proxy.
  • Access your dashboard on The Social Proxy and click “Buy Proxy” to select a plan.
  • Choose a plan: On the Buy Proxies page, select “Scraper API,” choose your subscription type, and click “Checkout.”
  • Provide payment details: Fill out your payment information and click “Sign up now.” Once you’ve signed up, you can start using the Scraper API.
  • Generate your Scraper API keys: You need to generate keys before you can start making API calls to the Scraper API. In the side menu, click “Scraper API,” then select “Scraper API.”
  • Click “Generate API KEY.”
  • Copy your credentials: Copy your Consumer Key and Consumer Secret; you’ll need them in your code (see the sketch below for a safer way to store them).
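Rather than hardcoding these keys in your scripts, a safer pattern is to read them from environment variables. Here’s a minimal sketch; the TSP_* variable names are just a suggested convention:

// Read credentials from the environment instead of hardcoding them.
// Set them in your shell before running a script, for example:
//   export TSP_CONSUMER_KEY="ck_..."
//   export TSP_CONSUMER_SECRET="cs_..."
const CONSUMER_KEY = process.env.TSP_CONSUMER_KEY;
const CONSUMER_SECRET = process.env.TSP_CONSUMER_SECRET;

if (!CONSUMER_KEY || !CONSUMER_SECRET) {
  throw new Error('Missing TSP_CONSUMER_KEY or TSP_CONSUMER_SECRET');
}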

Step 2: Scrape the competitor reviews using the Social Scraper API

The Social Proxy provides social scraper APIs that give you access to data from different social media platforms using various programming languages.

Install Node.js: Ensure you have Node.js installed on your computer. Run the command below to check.

node -v

If you don’t have Node.js installed, you can download it from the official Node.js website.

Create a project folder and open it in the code editor of your choice.
Initialize a new Node.js project in the folder by running the command:

npm init -y

Inside the folder, install the following dependencies using the command below. (Note that the request package is deprecated; it still works for this demo, but you could substitute a maintained HTTP client such as axios.)

npm install request csv-writer

For this demo, we’ll use the following scenario:

Shopify, a leading e-commerce platform, is competing with WooCommerce, a popular WordPress e-commerce plugin. Shopify’s sales team wants to understand how customers perceive WooCommerce to better position Shopify as the superior e-commerce solution.

The sales team must identify channels where WooCommerce users frequently discuss their experience. One such channel is social media. Next, they will scrape posts mentioning WooCommerce. The reverse lookup API will then be used to gather more information about the people who made the posts, allowing the Shopify sales team to analyze the posts and scout potential leads.

Create the script
Create a new file, for example, reviewScraper.js, and add the following script that integrates the social scraper API and performs a search on LinkedIn and Facebook using the keyword “WooCommerce review”:

const request = require('request');
const csv = require('csv-writer').createObjectCsvWriter;

// Function to scrape Facebook
function scrapeFacebook(callback) {
  const options = {
    method: 'POST',
    url: 'https://thesocialproxy.com/wp-json/tsp/facebook/v1/search/posts?consumer_key={CONSUMER_KEY}&consumer_secret={CONSUMER_SECRET}',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      typed_query: 'WooCommerce review',
      start_date: '2024-08-15',
      end_date: '2024-08-24',
    }),
  };

  request(options, function (error, response) {
    if (error) throw new Error(error);
    try {
      const data = JSON.parse(response.body);
      const results = data.data.results.map((result) => ({
        platform: 'Facebook',
        name: result.actors[0].name,
        text: result.message,
        url: result.url,
      }));
      callback(null, results);
    } catch (parseError) {
      callback(parseError);
    }
  });
}

// Function to scrape LinkedIn
function scrapeLinkedIn(callback) {
  const options = {
    method: 'GET',
    url: 'https://thesocialproxy.com/wp-json/tsp/linkedin/v1/search/posts?consumer_key={CONSUMER_KEY}&consumer_secret={CONSUMER_SECRET}&keywords=WooCommerce%20review', // spaces in query parameters must be URL-encoded
    headers: {
      'Content-Type': 'application/json',
    },
  };

  request(options, function (error, response) {
    if (error) throw new Error(error);
    try {
      const data = JSON.parse(response.body);
      const results = data.data.posts.map((post) => ({
        platform: 'LinkedIn',
        name: post.author.first_name + ' ' + post.author.last_name,
        text: post.text,
        url: post.url,
      }));
      callback(null, results);
    } catch (parseError) {
      callback(parseError);
    }
  });
}

// Function to save results to CSV
function saveToCSV(data) {
  const csvWriter = csv({
    path: 'WooCommerce.csv',
    header: [
      { id: 'platform', title: 'Platform' },
      { id: 'name', title: 'Name' },
      { id: 'text', title: 'Text/Message' },
      { id: 'url', title: 'URL' },
    ],
  });

  csvWriter
    .writeRecords(data)
    .then(() => console.log('CSV file has been written successfully'));
}

// Main function to run both scrapes and combine results
function reviewScraper() {
  scrapeFacebook((fbError, fbResults) => {
    if (fbError) {
      console.error('Facebook scraping error:', fbError);
      return;
    }

    scrapeLinkedIn((liError, liResults) => {
      if (liError) {
        console.error('LinkedIn scraping error:', liError);
        return;
      }

      const combinedResults = [...fbResults, ...liResults];
      saveToCSV(combinedResults);
    });
  });
}

// Run the scrapes
reviewScraper();


This script does the following:

  • It defines separate functions for scraping Facebook and LinkedIn.
  • Each scraping function formats the data into a consistent structure with platform, name, text, and URL.
  • A saveToCSV function is defined to save the combined results to a CSV file.
  • The reviewScraper function orchestrates the process, running both scrapes and then combining and saving the results.
  • The script uses the csv-writer package to create the CSV file.

To run this script:

  • Make sure you have the required packages installed (request and csv-writer).
  • Replace {CONSUMER_KEY} and {CONSUMER_SECRET} with your actual keys.
  • Run the script using Node.js, as shown below.
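From the project folder, run:

node reviewScraper.js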

The output will be saved in a file named WooCommerce.csv in the same directory as the script. You can then open the CSV file using Excel.
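Based on the header defined in saveToCSV, the first row of the file will look like this, with one scraped post per row beneath it:

Platform,Name,Text/Message,URL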

Step 3: Use reverse lookup and professional contact APIs to enrich the data collected from reviews

After gathering reviews from the different platforms, you can use the reverse lookup API to get enriched data about each individual using their name, email, or phone number.

To integrate the reverse lookup API into our script, follow these steps:

Set up the reverse lookup function
First, you’ll need to create a function to call the reverse lookup API using the names from the scraped data. This function will send a POST request to the API endpoint, passing through each name and retrieving additional information such as professional details, contact information, or other relevant data.

Use this code:

const request = require('request');
const csv = require('csv-writer').createObjectCsvWriter;

// ... (keep your existing scrapeFacebook and scrapeLinkedIn functions)

function reverseLookup(name) {
  return new Promise((resolve, reject) => {
    const options = {
      method: 'POST',
      url: 'https://thesocialproxy.com/wp-json/tsp/reverse-lookup/v1/name?consumer_key={CONSUMER_KEY}&consumer_secret={CONSUMER_SECRET}',
      headers: {
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        typed_query: name,
      }),
    };

    request(options, function (error, response) {
      if (error) {
        reject(error);
        return;
      }
      try {
        const data = JSON.parse(response.body);
        if (
          data.data &&
          data.data.data &&
          data.data.data[0] &&
          data.data.data[0].pluto &&
          data.data.data[0].pluto[1]
        ) {
          const userData = data.data.data[0].pluto[1];
          resolve({
            address: userData.address || '',
            city: userData.city || '',
            gender: userData.gender || '',
            age: userData.age || '',
          });
        } else {
          resolve({
            address: '',
            city: '',
            gender: '',
            age: '',
          });
        }
      } catch (parseError) {
        console.error('Error parsing reverse lookup data:', parseError);
        resolve({
          address: '',
          city: '',
          gender: '',
          age: '',
        });
      }
    });
  });
}


Enrich the scraped data
You can now use the reverseLookup function to enrich the data for each person in the reviews you scraped from LinkedIn and Facebook. This involves iterating over the combined results and performing a reverse lookup for each name.

// Function to enrich data with reverse lookup information
async function enrichData(data) {
  const enrichedData = [];
  for (const item of data) {
    try {
      console.log(`Performing reverse lookup for: ${item.name}`);
      const lookupData = await reverseLookup(item.name);
      enrichedData.push({
        ...item,
        ...lookupData,
      });
    } catch (error) {
      console.error(`Error enriching data for ${item.name}:`, error);
      enrichedData.push(item); // Add original data if enrichment fails
    }
  }
  return enrichedData;
}


Save the enriched data to a CSV file
Once you’ve enriched the data, save the results to a new CSV file. This file will include all the original fields plus any additional information retrieved from the reverse lookup API.

// Modified saveToCSV function to include new fields
function saveToCSV(data) {
  const csvWriter = csv({
    path: 'WooCommerce_enriched.csv',
    header: [
      { id: 'platform', title: 'Platform' },
      { id: 'name', title: 'Name' },
      { id: 'text', title: 'Text/Message' },
      { id: 'url', title: 'URL' },
      { id: 'address', title: 'Address' },
      { id: 'city', title: 'City' },
      { id: 'gender', title: 'Gender' },
      { id: 'age', title: 'Age' },
    ],
  });

  csvWriter
    .writeRecords(data)
    .then(() => console.log('Enriched CSV file has been written successfully'));
}


Integrate everything and run the enrichment
Update your main function to scrape, enrich, and save the data.

async function runScrapesAndEnrich() {
  try {
    const fbResults = await new Promise((resolve, reject) => {
      scrapeFacebook((error, results) => {
        if (error) reject(error);
        else resolve(results);
      });
    });

    const liResults = await new Promise((resolve, reject) => {
      scrapeLinkedIn((error, results) => {
        if (error) reject(error);
        else resolve(results);
      });
    });

    const combinedResults = [...fbResults, ...liResults];
    console.log(`Total scraped results: ${combinedResults.length}`);
    
    const enrichedResults = await enrichData(combinedResults);
    console.log(`Enrichment complete. Saving to CSV...`);
    
    saveToCSV(enrichedResults);
  } catch (error) {
    console.error('Error in scraping or enriching data:', error);
  }
}

// Run the scrapes, enrich data, and save to CSV
runScrapesAndEnrich();


After running the code, you should see log output in the terminal showing a reverse lookup message for each reviewer, followed by a confirmation that the enriched CSV file was written.

Now open the CSV file in Excel, and you’ll notice new fields containing the enriched data.

Utilizing the scraped data for sales strategy

Step 1: Analyze reviewer information

Once you have gathered and enriched data from LinkedIn and Facebook reviews, you can analyze it to identify potential leads for your sales strategy. Start by categorizing the reviewers based on the enriched data, such as job title, address, city, and industry. This allows you to segment the audience into relevant groups: decision-makers, influencers, or end-users.

Focus on key metrics like engagement frequency, sentiment analysis, and the reviewer’s professional influence within their industry. Use tools like Excel or more advanced analytics software to filter and visualize the data, identifying patterns that may indicate strong interest in your competitors’ products or services. Reviewers who consistently engage with specific types of content or express dissatisfaction with current solutions are prime candidates for outreach.
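As a starting point, here is a minimal sketch of a naive keyword-based pass over the enriched CSV to flag dissatisfied reviewers. The keyword list is an illustrative assumption; for real data you would parse the CSV properly and use a dedicated sentiment analysis library:

const fs = require('fs');

// Naive keyword matching over the enriched CSV. The keyword list is
// illustrative -- tune it for your market. For production, use a real CSV
// parser (review text often contains commas) and a proper sentiment library.
const PAIN_POINTS = ['slow', 'bug', 'expensive', 'frustrating', 'broken'];

const rows = fs
  .readFileSync('WooCommerce_enriched.csv', 'utf8')
  .split('\n')
  .slice(1) // skip the header row
  .filter(Boolean);

const potentialLeads = rows.filter((row) =>
  PAIN_POINTS.some((word) => row.toLowerCase().includes(word))
);

console.log(`${potentialLeads.length} reviews mention a possible pain point:`);
potentialLeads.forEach((row) => console.log(row));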

Target these individuals as potential leads by crafting personalized sales pitches that address their pain points or align with their professional interests; done well, this outreach can even turn them into advocates for your brand. The enriched data enables you to tailor your communication more effectively, increasing the chances of converting these leads into customers.

Step 2: Set up alerts for new reviews

To stay informed about new WooCommerce reviews on LinkedIn and Facebook in real time, let’s add an alert system to the script. It will periodically check for new posts and send you an email notification when any are detected.

const request = require('request');
const csv = require('csv-writer').createObjectCsvWriter;
const nodemailer = require('nodemailer');
const fs = require('fs');

// ... (keep your existing scrapeFacebook, scrapeLinkedIn, reverseLookup, enrichData, and saveToCSV functions)

// Function to send email alert
async function sendEmailAlert(newPosts) {
  // Create a transporter using SMTP
  let transporter = nodemailer.createTransport({
    host: "smtp.your-email-provider.com",
    port: 587,
    secure: false, // Use TLS
    auth: {
      user: "your-email@example.com",
      pass: "your-email-password"
    }
  });

  // Compose the email
  let message = {
    from: '"WooCommerce Alert" <your-email@example.com>',
    to: "recipient@example.com",
    subject: "New WooCommerce Reviews Alert",
    text: `New WooCommerce reviews have been posted:\n\n${newPosts.map(post => 
      `Platform: ${post.platform}\nName: ${post.name}\nText: ${post.text}\nURL: ${post.url}\n\n`
    ).join('')}`
  };

  // Send the email
  let info = await transporter.sendMail(message);
  console.log("Alert email sent: %s", info.messageId);
}

// Function to read the last checked timestamp
function getLastCheckedTimestamp() {
  try {
    return fs.readFileSync('last_checked.txt', 'utf8');
  } catch (error) {
    return null;
  }
}

// Function to write the last checked timestamp
function setLastCheckedTimestamp(timestamp) {
  fs.writeFileSync('last_checked.txt', timestamp);
}

// Function to check for new posts
async function checkForNewPosts() {
  const lastChecked = getLastCheckedTimestamp();
  const currentTime = new Date().toISOString();

  try {
    const fbResults = await new Promise((resolve, reject) => {
      scrapeFacebook((error, results) => {
        if (error) reject(error);
        else resolve(results);
      });
    });

    const liResults = await new Promise((resolve, reject) => {
      scrapeLinkedIn((error, results) => {
        if (error) reject(error);
        else resolve(results);
      });
    });

    const combinedResults = [...fbResults, ...liResults];
    
    // Filter for new posts
    const newPosts = combinedResults.filter(post => {
      // This assumes the post has a 'timestamp' field. Adjust as necessary.
      return !lastChecked || new Date(post.timestamp) > new Date(lastChecked);
    });

    if (newPosts.length > 0) {
      console.log(`Found ${newPosts.length} new posts`);
      const enrichedNewPosts = await enrichData(newPosts);
      await sendEmailAlert(enrichedNewPosts);
      await saveToCSV(enrichedNewPosts);
    } else {
      console.log('No new posts found');
    }

    setLastCheckedTimestamp(currentTime);
  } catch (error) {
    console.error('Error checking for new posts:', error);
  }
}

// Function to start periodic checking
function startPeriodicChecking(intervalMinutes) {
  console.log(`Starting periodic checking every ${intervalMinutes} minutes`);
  setInterval(checkForNewPosts, intervalMinutes * 60 * 1000);
}

// Start the periodic checking
startPeriodicChecking(60); // Check every 60 minutes

  • The getLastCheckedTimestamp and setLastCheckedTimestamp functions read and write the last-checked timestamp, letting the script track when it last looked for new posts.
  • The sendEmailAlert function uses nodemailer to send email alerts when new posts are found.
  • The checkForNewPosts function:
    • Retrieves the last-checked timestamp
    • Scrapes new data from Facebook and LinkedIn
    • Filters for posts newer than the last-checked timestamp
    • If new posts are found, enriches the data, sends an email alert, and saves the results to CSV
  • The startPeriodicChecking function runs checkForNewPosts at regular intervals.

To use this code, you’ll need to install the nodemailer package.

npm install nodemailer
  • Replace the email configuration in the sendEmailAlert function with your SMTP server details and email addresses.
  • Use the code below to adjust the scrapeFacebook and scrapeLinkedIn functions to include a timestamp for each post if they don’t already:
// Function to scrape Facebook
function scrapeFacebook(callback) {
  const options = {
    method: 'POST',
    url: 'https://thesocialproxy.com/wp-json/tsp/facebook/v1/search/posts?consumer_key={CONSUMER_KEY}&consumer_secret={CONSUMER_SECRET}',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      typed_query: 'WooCommerce review',
      start_date: '2024-08-15',
      end_date: '2024-08-24',
    }),
  };

  request(options, function (error, response) {
    if (error) throw new Error(error);
    try {
      const data = JSON.parse(response.body);
      const results = data.data.results.map((result) => ({
        platform: 'Facebook',
        name: result.actors[0].name,
        text: result.message,
        url: result.url,
        timestamp: new Date(result.creation_time * 1000).toISOString() // Convert Unix timestamp to ISO string
      }));
      callback(null, results);
    } catch (parseError) {
      callback(parseError);
    }
  });
}

// Function to scrape LinkedIn
function scrapeLinkedIn(callback) {
  const options = {
    method: 'GET',
    url: 'https://thesocialproxy.com/wp-json/tsp/linkedin/v1/search/posts?consumer_key={CONSUMER_KEY}&consumer_secret={CONSUMER_SECRET}&keywords=WooCommerce review',
    headers: {
      'Content-Type': 'application/json',
    },
  };

  request(options, function (error, response) {
    if (error) throw new Error(error);
    try {
      const data = JSON.parse(response.body);
      const results = data.data.posts.map((post) => ({
        platform: 'LinkedIn',
        name: post.author.first_name + ' ' + post.author.last_name,
        text: post.text,
        url: post.url,
        timestamp: new Date(post.created_at).toISOString() // Convert to ISO string
      }));
      callback(null, results);
    } catch (parseError) {
      callback(parseError);
    }
  });
}


Next, decide on an appropriate interval for checking (currently set to 60 minutes) and adjust if needed. This system will run continuously, checking for new posts at the specified interval, sending email alerts, and updating the CSV file when new posts are found. With this information, your sales team will have all they need to create a sales strategy using the reviews obtained from the script.
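To keep the alert script alive on a server, you can launch it with Node.js and, optionally, hand it to a process manager such as pm2 so it restarts automatically. (Here, alertScraper.js is whatever filename you saved the alert code under.)

node alertScraper.js

# Or, with pm2 installed globally (npm install -g pm2):
pm2 start alertScraper.js --name woocommerce-alerts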

Conclusion

Competitor monitoring is not a one-time task; it’s an ongoing practice that’s essential for keeping up with market changes. Tools like the Social Scraper API keep data collection straightforward while you gather accurate, comprehensive review data.

You can further enrich the data by leveraging reverse lookup and professional contacts APIs to gather valuable insights about reviewers. These APIs allow you to understand customer sentiment better and identify potential leads. Together, these tools can provide you with a clearer picture of your competitors’ strengths and weaknesses and empower your sales team to make informed, strategic outreach decisions. If your team is not currently leveraging these tools to strengthen your sales strategy, consider the potential leads you might be leaving on the table.
