Feed Your Analytics Program with Good Data
Add geospatial data to your mixing bowl.
- Problem: Your data science program requires high-quality data quickly.
- Solution: Satellites + cloud + AI
- Benefit: Decrease risk, minimize consequences, increase safety
Data science is transforming industry as connectivity and cloud computing provide analysts with the power to optimize every corner of a company’s operation. Some of the most enthusiastic groups embracing Satelytics’ geospatial analytics are data scientists who want high-quality data, the newest data, and lots of it.
Let’s hit on each of these three data topics as they pertain to geospatial analytics and the available offerings.
Quality Data Sources
Satelytics’ algorithms focus on the analysis of multispectral infrared data. We are agnostic to the source of that data, though most projects use satellite imagery because it is the most efficient and cost-effective option for very large asset areas. Satelytics tasks commercial satellites to “look” at specific areas of interest, and commercial satellite vendors offer highly accurate data: sensors are calibrated regularly, ensuring that passive infrared wavelength detection remains constant. With pixel sizes down to one square foot, detection and quantification are highly accurate, identifying even the smallest constituents and changes. Quality data sources allow data scientists to build quality models.
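To make the per-pixel band math concrete, here is a minimal sketch of the kind of spectral-index computation multispectral analysis builds on. The normalized-difference formula and the NIR/red band names are generic illustrations (an NDVI-style index), not Satelytics’ proprietary algorithms, and the arrays are synthetic placeholder data.

```python
import numpy as np

def normalized_difference(band_a: np.ndarray, band_b: np.ndarray) -> np.ndarray:
    """Per-pixel band math: (a - b) / (a + b), the classic normalized-difference index."""
    a = band_a.astype(np.float64)
    b = band_b.astype(np.float64)
    denom = a + b
    # Guard against division by zero on dark or masked pixels.
    return np.divide(a - b, denom, out=np.zeros_like(denom), where=denom != 0)

# Placeholder 1024 x 1024 reflectance bands standing in for calibrated imagery.
nir = np.random.rand(1024, 1024)  # near-infrared band (synthetic data)
red = np.random.rand(1024, 1024)  # visible red band (synthetic data)

index = normalized_difference(nir, red)  # NDVI-style vegetation index
flagged = np.argwhere(index > 0.6)       # pixels above an illustrative threshold
print(f"{len(flagged)} pixels exceed the threshold")
```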
The Newest Data
Commercial satellite platforms include not only the sensor but also the ability to downlink that data quickly to the cloud, where Satelytics’ algorithms go to work. To translate satellite data into real-life action, an analytics step in between must transform raw data into actionable alerts. At Satelytics, we affectionately call this process our “sausage machine”: we ingest data, process it, and produce actionable alerts, directing your field teams only to those spots requiring their skilled attention. We talk about our measurement algorithms a lot, but Satelytics has also spent significant time and money on the preprocessing step, getting data from multiple sources into a usable product as quickly as possible.
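As a rough illustration of such an ingest-process-alert flow, the sketch below wires those steps together. The scene objects, preprocess step, and Alert fields are hypothetical stand-ins, not Satelytics’ actual API or schema.

```python
from dataclasses import dataclass
from typing import Any, Callable, Iterable, List

# Hypothetical alert record: fields are illustrative, not Satelytics' schema.
@dataclass
class Alert:
    lat: float        # alert location (degrees)
    lon: float
    finding: str      # what was detected, e.g. "hydrocarbon" or "encroachment"
    magnitude: float  # quantified size of the finding

def run_pipeline(scenes: Iterable[Any],
                 preprocess: Callable[[Any], Any],
                 algorithms: List[Callable[[Any], List[Alert]]]) -> List[Alert]:
    """Ingest raw scenes, preprocess each one, run every algorithm, collect alerts."""
    alerts: List[Alert] = []
    for scene in scenes:
        ready = preprocess(scene)            # e.g. calibration, orthorectification
        for algorithm in algorithms:
            alerts.extend(algorithm(ready))  # each algorithm emits zero or more alerts
    return alerts
```

The design point is the funnel: raw pixels go in, and only the handful of alerts worth a field visit come out.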
For one of our current customers, we recently tasked a satellite collect over their area of interest on a Saturday at noon. The data was gathered, QA/QC’d, preprocessed, and fed through the sausage machine, and results were posted to the cloud via the customer’s web portal… that afternoon! Unlike other methods you may have explored, we operate in the here and now, because when an unwanted event strikes, you want to deal with it immediately, before the consequences grow large.
Big Data
The term is thrown around a lot today, but it is apt for geospatial data sets. Our motto is “every pixel, every time.” When we survey 6,000 miles of right-of-way for a customer, that’s a lot of pixels: 42 terabytes of data! With virtually unlimited cloud computing capacity, we quickly reduce this to meaningful alerts for our customers.
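One way to honor “every pixel, every time” without holding 42 terabytes in memory is to stream the mosaic in tiles, visiting every pixel but keeping only the ones that trigger an alert. The tile size, threshold, and single-band mosaic below are illustrative assumptions, not Satelytics’ implementation.

```python
import numpy as np

TILE = 2048  # illustrative tile edge; real tiling would match the storage format

def tile_alerts(mosaic: np.ndarray, threshold: float):
    """Visit every pixel of a large single-band mosaic, tile by tile,
    yielding (row, col, value) only for pixels that exceed the threshold."""
    rows, cols = mosaic.shape
    for r in range(0, rows, TILE):
        for c in range(0, cols, TILE):
            tile = mosaic[r:r + TILE, c:c + TILE]
            for dr, dc in np.argwhere(tile > threshold):
                yield r + int(dr), c + int(dc), float(tile[dr, dc])

# Tiny demo: a 4096 x 4096 synthetic mosaic reduced to a handful of alert pixels.
demo = np.random.rand(4096, 4096)
print(sum(1 for _ in tile_alerts(demo, threshold=0.999999)))
```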
Our data scientists have built the best machine learning algorithms in the geospatial business, and the accuracy of these measurements continues to grow as they are fed more data. Satelytics’ 40+ algorithms have demonstrated accuracy within 10% right out of the box, and many of them are within a few percent. The more they do, the more they learn, especially when fed ground-verified feedback from our wonderful customers.
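That learn-from-feedback loop can be pictured as incremental refitting: as ground-verified labels come back from the field, they are folded into the model. The SGDClassifier below is a generic stand-in for whatever models Satelytics actually trains, and the spectra and labels are placeholder data.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier  # generic incremental learner

model = SGDClassifier(random_state=0)
classes = np.array([0, 1])  # 0 = no finding, 1 = finding confirmed in the field

def incorporate_feedback(spectra: np.ndarray, labels: np.ndarray) -> None:
    """Fold a batch of ground-verified results back into the model."""
    model.partial_fit(spectra, labels, classes=classes)

# Placeholder feedback batch: 32 pixels x 8 spectral bands, with field labels.
incorporate_feedback(np.random.rand(32, 8), np.random.randint(0, 2, size=32))
```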
Investment in new observation platforms above the earth’s surface is staggering right now. Within a few short years, satellite data is predicted to be available every few minutes, anywhere on earth. The data sources and volumes are multiplying rapidly.
Data scientists, unite, and complement your company’s current data programs with geospatial analytics!