A hyper-scalable, general-purpose machine-learning platform

The Descartes Platform is the foundation on which we build solutions, train models, and generate forecasts. We designed the Descartes Platform to tackle previously intractable compute- and data-intensive problems. And now, you can use it to answer your own questions.

The Descartes Platform provides data pipelines, data libraries, and analytics pipelines.

Processing power for massive datasets

A revolution in sensor technology is creating an explosion in data. Putting big data to work requires a platform that can ingest and process massive datasets. To tackle this problem, we built a supercomputer in the cloud.

Currently, the Descartes Platform ingests 5 terabytes (TB) of near-real-time data per day, roughly the equivalent of 5,000 hours of standard-definition video. Our corpus now exceeds 3 petabytes (3,000 TB) and can grow much larger. With sensor data growing exponentially, the Descartes Platform is designed to scale elastically with this data explosion and harness it for real-time forecasting.

Using 30,000 processor cores, the Descartes Platform has processed 1 petabyte of data in under 24 hours—among the largest and fastest scientific data processing tasks ever performed for commercial purposes. The platform parallelizes large calculations automatically, scaling cloud resources on demand, in response to both calculation size and desired processing time. Thus, data collected over decades can be processed in hours.
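To make that scaling behavior concrete, here is a minimal sketch of the sizing logic, assuming a hypothetical per-core throughput and a generic Python process pool. It illustrates the pattern of scaling with calculation size and target time; it is not the platform's actual scheduler or API.

```python
import math
from concurrent.futures import ProcessPoolExecutor

# Hypothetical figures, chosen only for illustration.
CORPUS_TB = 1_000               # 1 petabyte, expressed in terabytes
PER_CORE_TB_PER_HOUR = 0.0015   # assumed throughput of a single core
TARGET_HOURS = 24               # desired wall-clock processing time

def cores_needed(corpus_tb, per_core_rate, target_hours):
    """Size the worker pool from the calculation size and the desired time."""
    return math.ceil(corpus_tb / (per_core_rate * target_hours))

def process_chunk(chunk_id):
    """Stand-in for the real per-chunk work (decode, correct, index)."""
    return chunk_id

if __name__ == "__main__":
    n_cores = cores_needed(CORPUS_TB, PER_CORE_TB_PER_HOUR, TARGET_HOURS)
    print(f"Requesting roughly {n_cores} cores")  # about 28,000 with these assumed numbers

    # A single machine can only demonstrate the pattern, not the scale.
    with ProcessPoolExecutor(max_workers=8) as pool:
        results = list(pool.map(process_chunk, range(64)))
```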

High processing rates also enable us to quickly iterate models. We've tested thousands of models and implemented a process to continually improve them.
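As a sketch of what rapid model iteration can look like, the loop below evaluates a family of candidate models by cross-validation and keeps the best one. The data, model family, and hyperparameter grid are illustrative placeholders, not the models or process described above.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data: imagery- and weather-derived features and a
# yield-like target. Purely illustrative.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 8))
y = X @ rng.normal(size=8) + rng.normal(scale=0.5, size=500)

# Evaluate candidate models and keep whichever scores best; in practice
# a loop like this reruns as new data arrives.
best_alpha, best_score = None, -np.inf
for alpha in [0.01, 0.1, 1.0, 10.0, 100.0]:
    score = cross_val_score(Ridge(alpha=alpha), X, y, cv=5).mean()
    if score > best_score:
        best_alpha, best_score = alpha, score

print(f"Best alpha: {best_alpha}, cross-validated R^2: {best_score:.3f}")
```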

Chart: total data corpus, in petabytes.

A huge data library

Our original mission was to better understand one of the largest datasets: satellite imagery. We have full imagery archives (some including data only a few hours old) from hundreds of satellites. Our imagery is available in standardized formats for side-by-side comparison of sources. Cloud masks and atmospheric correction are also available.
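The snippet below sketches why a common grid, cloud masks, and consistent units matter: once two scenes share a format, comparing sources is a few lines of array math. The arrays are random stand-ins rather than real platform imagery, and the band layout is an assumption made for the example.

```python
import numpy as np

# Stand-ins for two scenes of the same area from different satellites,
# already resampled onto a common grid and scaled to surface reflectance.
rng = np.random.default_rng(0)
scene_a = rng.uniform(0.0, 0.4, size=(2, 256, 256))   # assumed bands: [red, nir]
scene_b = rng.uniform(0.0, 0.4, size=(2, 256, 256))
cloud_mask_a = rng.random((256, 256)) < 0.1            # True where scene A is cloudy

def ndvi(scene):
    """Normalized difference vegetation index from red and near-infrared bands."""
    red, nir = scene[0], scene[1]
    return (nir - red) / (nir + red + 1e-9)

# Drop cloudy pixels before comparing the two sources side by side.
ndvi_a = np.where(cloud_mask_a, np.nan, ndvi(scene_a))
ndvi_b = ndvi(scene_b)
difference = ndvi_a - ndvi_b  # per-pixel disagreement between sources
print(f"Mean |difference| over clear pixels: {np.nanmean(np.abs(difference)):.3f}")
```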

In the course of building our corn and soy production models, we found that satellite imagery becomes more valuable when augmented with other datasets. Connecting disparate datasets creates a network effect. And improving data volume and quality is as important as improving algorithms. For those reasons, the Descartes Platform is built to ingest virtually any kind of data, including weather data, commodity price histories, web crawls, and sentiment analysis from social media networks.
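As a small illustration of connecting disparate datasets, the sketch below joins an imagery-derived vegetation index with weather and price series on a shared date column. The column names and values are hypothetical, not the platform's schema.

```python
import pandas as pd

# Hypothetical daily series for one growing region.
dates = pd.date_range("2023-06-01", periods=5, freq="D")
ndvi = pd.DataFrame({"date": dates, "ndvi": [0.41, 0.43, 0.44, 0.47, 0.49]})
weather = pd.DataFrame({"date": dates, "precip_mm": [0.0, 4.2, 0.0, 0.0, 11.5]})
prices = pd.DataFrame({"date": dates, "corn_usd_bu": [5.91, 5.88, 5.95, 6.02, 5.97]})

# Align everything on date so one model can see imagery-derived, weather,
# and market signals together.
features = ndvi.merge(weather, on="date").merge(prices, on="date")
print(features)
```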

All this data is stored in a manner that's efficient for building models and making forecasts. And it's all available to you.

Figure: The Descartes Platform ingests data from a wide range of sources, including satellite imagery, weather, pricing, and sentiment.

A new way of doing science

A series of new technologies, from the sensor revolution to big data and machine learning, is changing the way science is done. As the marginal cost of computing falls to nearly zero, processing becomes a utility, and scientists can run massive calculations on demand.

Sampling parts becomes measuring the whole; experiments become continuous monitoring; and models are tested and refined almost continuously. A new way of doing science is born.

We launched the Descartes Platform so that we could advance the science of forecasting and ask new kinds of questions. Now, you can participate in this new way of doing science, too. The Descartes Platform enables you to apply state-of-the-art machine learning techniques to the largest datasets quickly, so that you have the information you need to decide and act.

Want to understand how the Descartes Platform can unlock insights in your enterprise data?