Web scraping case study for a manufacturer: monitoring compliance with the MSRP
Find out how a household appliance manufacturer monitors retailers' compliance with its manufacturer's suggested retail price (MSRP).

Customer challenges
The customer sells household appliances to large retailers. The customer sets suggested prices for its products and therefore needs to know what prices retailers actually sell them at.
The retailers are large e-commerce companies, which means prices have to be checked in several regions at once. In addition to prices, the customer also wanted to know the availability of goods in each region.
According to the customer's requirements, the price check must be performed regularly, every day.
The report must be generated from a specific template and sent by email.

The main details of the project
The retailers' websites are well protected against scraping
That is why we had to develop a dedicated system that keeps the scraper from being blocked.
Prices vary depending on the region
That is why the scraper has to be configured for the desired geographical location.
The report must follow the customer's template
The customer has his own report in xlsx format, so we needed to reproduce the same template.
Solution
Development of a scraper that collects all the necessary information.
Daily data collection with automatic report generation and delivery by email.
Project stages
1. Customer requirements
We gathered the customer's requirements and reviewed the retailers' websites.
2. Scraper development
We developed a scraper that collects all the necessary data.
3. Report generation
We developed a program that generates the report in the required format.
4. Setting up data delivery
We set up sending the report with the data to the customer's email.
5. Testing and launch
We tested the system's stability and then made the commercial launch of the program.
Implementation
A few words about how the customer's task was implemented
Data collection
- The customer needs to monitor prices only for his own products, so we set up data collection for those specific goods only.
- Since prices and availability of goods differ by region for each retailer, we set up data collection in all the required cities. The data appears in the report for each city separately.
- To make sure prices and availability are collected without blocking and within predictable timeframes, we configured the scraper so that it does not overload the retailers' websites and used a number of microservices that let us manage data collection flexibly. A minimal sketch of such a collection loop follows this list.
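The sketch below illustrates one possible way to collect prices and availability per city while staying polite to the retailer's servers. The URLs, city codes, cookie name and CSS selectors are hypothetical placeholders rather than any real retailer's markup, and the rotating proxies and microservices used in the actual project are omitted.

```python
import random
import time

import requests
from bs4 import BeautifulSoup

# Hypothetical example values: real URLs, city codes and selectors differ per retailer.
CITIES = {"Moscow": "77", "Saint Petersburg": "78"}
PRODUCT_URLS = [
    "https://retailer.example/product/washing-machine-123",
    "https://retailer.example/product/fridge-456",
]
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (X11; Linux x86_64)",
]

def scrape_city(city_name: str, city_code: str) -> list[dict]:
    """Collect price and availability for the monitored products in one city."""
    session = requests.Session()
    rows = []
    for url in PRODUCT_URLS:
        resp = session.get(
            url,
            headers={"User-Agent": random.choice(USER_AGENTS)},
            cookies={"region_id": city_code},  # assumed region-selection mechanism
            timeout=30,
        )
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        price_el = soup.select_one(".product-price")  # hypothetical selector
        stock_el = soup.select_one(".availability")   # hypothetical selector
        rows.append({
            "city": city_name,
            "url": url,
            "price": price_el.get_text(strip=True) if price_el else None,
            "in_stock": stock_el is not None,
        })
        # Polite delay so the scraper does not overload the retailer's website.
        time.sleep(random.uniform(2, 5))
    return rows

all_rows = [row for name, code in CITIES.items() for row in scrape_city(name, code)]
```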
Creating a report
- One of the customer's requirements was a report based on his template in xlsx format. That is why we customised report generation so that the final file with the collected data exactly matches the template provided by the customer (see the sketch below).
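As an illustration, a template in xlsx format can be filled with the collected rows using openpyxl. This is only a sketch under assumed names: the template path, sheet name and column order are invented, not the customer's actual layout.

```python
from openpyxl import load_workbook

def build_report(rows: list[dict], template_path: str, out_path: str) -> None:
    """Fill the customer's xlsx template with the collected rows."""
    wb = load_workbook(template_path)        # customer's template file
    ws = wb["Prices"]                        # assumed sheet name
    for i, row in enumerate(rows, start=2):  # row 1 holds the template header
        ws.cell(row=i, column=1, value=row["city"])
        ws.cell(row=i, column=2, value=row["url"])
        ws.cell(row=i, column=3, value=row["price"])
        ws.cell(row=i, column=4, value="in stock" if row["in_stock"] else "out of stock")
    wb.save(out_path)
```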
Price monitoring schedule and report sending
- The customer wants to receive the data every day at a certain time. To meet this deadline, we estimated how long scraping takes and scheduled the scraper run and report delivery so that the file reaches the customer's email on time (a scheduling sketch follows).
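One minimal way to wire the daily run together is sketched below, reusing scrape_city, CITIES and build_report from the earlier sketches. The SMTP host, credentials, addresses and the 06:00 start time are placeholder assumptions; in production the same job could equally be triggered by cron or a task queue.

```python
import smtplib
import time
from email.message import EmailMessage

import schedule  # third-party scheduling library, used here for illustration

def send_report(report_path: str) -> None:
    """Email the generated xlsx report; host, login and addresses are placeholders."""
    msg = EmailMessage()
    msg["Subject"] = "Daily MSRP monitoring report"
    msg["From"] = "monitoring@example.com"
    msg["To"] = "customer@example.com"
    msg.set_content("The daily price and availability report is attached.")
    with open(report_path, "rb") as f:
        msg.add_attachment(
            f.read(),
            maintype="application",
            subtype="vnd.openxmlformats-officedocument.spreadsheetml.sheet",
            filename="report.xlsx",
        )
    with smtplib.SMTP("smtp.example.com", 587) as server:
        server.starttls()
        server.login("monitoring@example.com", "app-password")
        server.send_message(msg)

def daily_job() -> None:
    # scrape_city, CITIES and build_report come from the sketches above.
    rows = [row for name, code in CITIES.items() for row in scrape_city(name, code)]
    build_report(rows, "template.xlsx", "report.xlsx")
    send_report("report.xlsx")

# Start early enough for the report to be ready by the agreed delivery time.
schedule.every().day.at("06:00").do(daily_job)
while True:
    schedule.run_pending()
    time.sleep(60)
```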

Project results
1. The entire process of data collection and delivery was fully automated.
2. The customer can detect retailers that deviate from the MSRP: he regularly receives up-to-date information about prices and availability on the retailers' websites.
3. The data is provided in a convenient format and at the right time.
How can brands and manufacturers control the MSRP?
PricingCraft is a price monitoring service.
We help build a competitive business and increase profits by controlling prices.
