How much is fare scraping costing the travel industry?

In a 2021 survey by Netacea, 96% of travel companies said their website had been attacked by bots over the previous 12 months.
Yasmin Duggal, Cybersecurity Content Specialist

Scraper bots account for the worst of the bad bot traffic hitting the travel industry, with some sites seeing over 90% of their traffic attributed to fare scraping. Whilst this activity can be benign or even used for positive means, if left uncontrolled it can hit top-line revenue, bottom-line profits and customer experience.

In a recent webinar, Netacea’s Head of Threat Research, Matthew Gracey-McMinn, and Enterprise Sales Manager for Travel and Tourism, Graeme Harvey, were joined by Ann Cederhall, Travel Technology Specialist at LeapShift.

The panel discussed the effect of the Covid-19 pandemic on the travel and tourism industry, the damage of fare scraping and excess web requests on travel booking sites, and how travel companies can take back control of their look-to-book ratio.

Catch up on all the key takeaways below or watch the full webinar on demand here.

What is the look-to-book ratio?

The look-to-book ratio is the number of requests made per booking on an online travel site.

Requests can be made by humans or bots, and the lower the look-to-book ratio, the better: a low ratio means a high proportion of genuine customers browsing the website go on to convert. Scraper activity, however, can push look-to-book ratios into the thousands, inflating the number of requests relative to the number of conversions.

When scraper bots pull information from a website, they create excess web requests which, in turn, negatively impact your look-to-book ratio. Increased competition driven by the pandemic, and the popularity of dynamic pricing, mean this is fast becoming a top threat for the travel industry.
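As a rough illustration of the metric described above (the function name and counts here are hypothetical, not drawn from the webinar), the look-to-book ratio is simply search requests divided by completed bookings:

```python
def look_to_book_ratio(search_requests: int, bookings: int) -> float:
    """Ratio of search/availability requests to completed bookings.

    Lower is better: it means more of the traffic converts.
    """
    if bookings == 0:
        # No conversions at all: treat the ratio as infinite.
        return float("inf")
    return search_requests / bookings

# A healthy site: 500 searches leading to 10 bookings -> ratio of 50.
print(look_to_book_ratio(500, 10))        # 50.0

# Under heavy scraping: 1,000,000 requests, still 10 bookings.
print(look_to_book_ratio(1_000_000, 10))  # 100000.0
```

The second case shows why scraping is so corrosive: the bookings stay flat while per-request costs (GDS fees, infrastructure) scale with the numerator.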

How does fare scraping work on travel booking websites? 

In travel, web scraper bots are mainly used by rival companies and aggregator sites to collect fare and availability information for price comparison.

But travel booking sites are also targeted by scraper bots used to discover and publicise the availability of products and services such as flights, hotels and car rentals.

Attackers advertise the scraped information at lower price points on a secondary site, motivated by the financial reward of charging commission, stealing personal data, or generating advertising revenue.

Scraping is also often used to gather the data needed for more sophisticated or damaging attacks such as ticket spinning or denial of inventory. Preventing malicious scraper bots can cut out these further attacks early as the attackers do not have the data they need to progress.

‘The attackers who were working in travel have moved out of the space over the last year, but they are now coming back with new skills and techniques learned from other industries [during the Covid-19 pandemic].’

– Matthew Gracey-McMinn, Head of Threat Research at Netacea

What damage are inflated look-to-book ratios causing to travel companies? 

‘The cost for an airline of having excess transactions – of having pricing systems being queried – can be very substantial, and also uncontrollable.’

– Ann Cederhall, Travel Technology Specialist at LeapShift

Excess traffic caused by aggressive scraping and high look-to-book ratios negatively impact airlines and travel companies both on their bottom line and on their technical performance.

Business costs 

  • Additional fees (potentially millions per year) paid to third-party services such as metasearch engines and GDSs, which charge based on traffic volume
  • Extra costs for SIEM and anti-fraud solutions, which also charge based on traffic volume
  • Loss of pricing visibility, leading to a competitive pricing disadvantage
  • Misleading analytics, as visitor numbers no longer reflect genuine interest in a product
  • Loss of ancillary revenues (or “add-ons”), e.g. hotels, travel insurance and car hire

Technical costs 

  • Excess infrastructure costs (up to 50% of capacity) spent serving bots
  • IT teams pulled away from daily tasks to deal with bots
  • Slowed website performance leading to negative user experience
  • Costly downtime in extreme cases

How to prevent scraping, excess transactions and high look-to-book ratios 

The travel industry has been one of the most severely affected by bad bots since the advent of online travel. As bots grow in sophistication and volume, it is crucial for travel websites to accurately detect and block bad bots without affecting the good bots necessary for the smooth running of the site, or genuine users’ experience.
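One coarse detection signal implied by the discussion above is the per-client look-to-book ratio: a client making thousands of searches without ever booking looks far more like a scraper than a customer. The sketch below assumes simple per-IP request and booking logs and an illustrative threshold; it is not a description of Netacea’s actual detection logic.

```python
from collections import Counter

def flag_scraper_ips(request_ips, booking_ips, ratio_threshold=1000):
    """Flag client IPs whose individual look-to-book ratio is suspiciously high.

    request_ips / booking_ips: one IP string per search request / booking.
    The per-IP grouping and threshold are illustrative assumptions only.
    """
    requests = Counter(request_ips)
    bookings = Counter(booking_ips)
    flagged = set()
    for ip, n_requests in requests.items():
        n_bookings = bookings.get(ip, 0)
        # An IP with many searches and no bookings gets an infinite ratio.
        ratio = n_requests / n_bookings if n_bookings else float("inf")
        if ratio > ratio_threshold:
            flagged.add(ip)
    return flagged

# 5,000 searches and zero bookings from one IP is flagged;
# 40 searches with a booking is a plausible human shopper.
print(flag_scraper_ips(["1.1.1.1"] * 5000 + ["2.2.2.2"] * 40, ["2.2.2.2"]))
```

In practice a signal this simple is easily evaded (scrapers rotate IPs and mimic browsers), which is why the panel stresses behavioural detection rather than static thresholds.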

As Ann Cederhall concluded in the webinar:

‘It’s all about control.’

Download the full guide: Fare Scraping and Excess Transactions: The Real Cost to the Travel Industry

Avoid being outplayed by your competitors
Netacea provides actionable intelligence about the quality of traffic coming in, helping you make informed decisions about how much you want to invest in acquiring new users.


Yasmin Duggal is a technical writer at Netacea specializing in cybersecurity. In her current role in the marketing team, she works closely with the Threat Research team to produce detailed yet accessible content on the latest trends within bot management and the wider cybersecurity landscape. In her previous position at a cloud hosting company, she gained experience working with professionals from across the tech industry.