The hospitality industry thrives on large volumes of data, and most hotel operators today agree that it is more critical than ever to address data quality issues head-on. According to a recent Gartner survey, bad data costs companies an average of $14 million per year. That cost is likely to grow as companies acquire more data and continue to rely on it for their revenue decisions.
To address this challenge, RateGain today announced the release of DataSure, a machine learning-enabled data quality and data profiling framework designed to quickly identify and fix data sufficiency issues in every report a hotel generates. The solution profiles hotel clients' data by running it through a series of quality checks, with accuracy checks executed by artificial intelligence and machine learning systems built in-house.
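The announcement does not describe DataSure's internals. Purely as an illustration of what a rule-based quality checklist over hotel rate records could look like, here is a minimal sketch; every field name, rule, and threshold below is a hypothetical assumption, not RateGain's actual implementation.

```python
# Hypothetical sketch of rule-based data-quality checks over rate records.
# Field names ("hotel_id", "date", "rate") and thresholds are illustrative only.

def check_completeness(rows, required_fields):
    """Flag rows missing any required field."""
    return [r for r in rows if any(r.get(f) in (None, "") for f in required_fields)]

def check_rate_range(rows, min_rate=10.0, max_rate=10_000.0):
    """Flag rows whose room rate falls outside a plausible range."""
    return [r for r in rows if not (min_rate <= r.get("rate", 0) <= max_rate)]

def profile(rows):
    """Run each check and report how many rows fail it."""
    checks = {
        "completeness": check_completeness(rows, ["hotel_id", "date", "rate"]),
        "rate_range": check_rate_range(rows),
    }
    return {name: len(failed) for name, failed in checks.items()}

sample = [
    {"hotel_id": "H1", "date": "2018-01-05", "rate": 129.0},
    {"hotel_id": "H2", "date": "2018-01-05", "rate": 0.0},  # implausible rate
    {"hotel_id": "H3", "date": "2018-01-05"},               # missing rate field
]
print(profile(sample))  # → {'completeness': 1, 'rate_range': 2}
```

In practice, a framework like this would layer learned checks (e.g. anomaly detection on rate distributions) on top of such fixed rules, which is presumably where DataSure's machine learning component comes in.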
The launch of DataSure is intended to improve the accuracy of hotel data so that hotels are better equipped to make informed decisions.
RateGain launched Optima in October 2016 as a solution providing comprehensive rate intelligence to hotels by tracking more than 500 OTAs, meta-search sites, GDSs, brand sites, and mobile apps, covering data on more than 900,000 room types. In April of last year, the company added a "lightning refresh" feature to Optima that gives hotels real-time competitor data in less than 60 seconds.
Today Optima is widely used to leverage the power of price intelligence and maintain rate parity across all channels. The launch of DataSure should further ensure that RateGain's hotel customers are basing their revenue decisions only on the highest-quality data possible, further reducing the business risks that hotels face due to inaccurate data.
According to sources, DataSure will be rolled out to all existing customers of RateGain early this year.