Superalgos Markets Research fetches historical and live market data from exchanges and puts it on your desktop, along with unique tools to crunch the numbers.
The data mining system within the Superalgos Platform is designed with extreme flexibility in mind. It allows market data to be processed and reprocessed through multiple layers of calculations, building datasets that may serve as input for further calculations.
The definitions required to process an input dataset and output a new, custom dataset are created on a visual interface. This includes the definitions of data dependencies and the architecture of the output dataset. All you need to code is the calculation procedure.
The icing on the cake is the robust data visualization feature, with which you may create plotters that render graphic representations of data over a timeline. The visualization may also include standard market information such as candles or indicators.
Superalgos Markets Research lets you crunch market numbers with extreme ease.
You start with raw market data in the form of trades or candles coming directly from the exchanges of your choice. The data is fetched dynamically and stored locally as standardized datasets of flat files. On top of raw market data, the system provides several popular indicators that process the raw data and output elaborate datasets. The processing is done in parallel for all time frames, and each time frame is stored in separate flat files. This is the data you may use as input for further processing.
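As a hedged illustration of the kind of transformation involved in going from raw trades to candles (the function name, field names, and record shape here are assumptions for the example, not the platform's actual schema), aggregating a batch of trades into a single OHLCV candle might look like this:

```javascript
// Illustrative sketch only: collapse an array of raw trades into one
// OHLCV candle. Each trade is assumed to carry a price and an amount,
// and the array is assumed to be sorted in chronological order.
function tradesToCandle(trades) {
    const prices = trades.map(t => t.price);
    return {
        open: prices[0],                                 // first traded price
        high: Math.max(...prices),                       // highest price in the period
        low: Math.min(...prices),                        // lowest price in the period
        close: prices[prices.length - 1],                // last traded price
        volume: trades.reduce((sum, t) => sum + t.amount, 0) // total amount traded
    };
}
```

In the platform, one such record would be produced per unit of time, for each of the standard time frames.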
The tools Superalgos Markets Research features for processing data are the same tools we use to build the indicators that ship with the Superalgos Platform. Building an indicator or any other form of analytical study involves setting up several configurations on a visual interface. You first define the processes, specifying what the input datasets are and establishing dependencies with the processes that provide them. This allows data to be processed in real time, in sync with the input sources.
Then you define the outputs. The same process may perform different sets of calculations and, therefore, may output different products. Each product results in a dataset, that is, a collection of flat files spanning all desired time frames. Again, the whole setup is done using the visual interface and requires no coding. Instead, you create definitions with a few clicks and establish relationships among processes with intuitive proximity-drag operations. It's pretty much like building a computer program with Legos!
All the code you need to write is the JavaScript code for the actual calculation procedure, that is, the mathematical calculations you wish to run on the input dataset.
To give you an idea of how little coding is required to process data with Superalgos Markets Research, the snippet here shows all the code required to calculate the Simple Moving Average indicator.
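For a sense of scale, a Simple Moving Average boils down to very little logic. The sketch below is illustrative only, written as a standalone function; the actual Superalgos snippet may differ in names and structure:

```javascript
// Illustrative SMA sketch (not the platform's actual code): at each
// step along the series, average the last `period` closing prices.
function simpleMovingAverage(closes, period) {
    const sma = [];
    for (let i = period - 1; i < closes.length; i++) {
        let sum = 0;
        for (let j = i - period + 1; j <= i; j++) {
            sum += closes[j];
        }
        sma.push(sum / period);
    }
    return sma;
}
```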
That's all the code!
Needless to say, Superalgos Markets Research offers a robust infrastructure that handles complexity under the hood so that you don't have to worry about it.
Create dynamic and interactive custom visualizations of any sort of market data without a single line of code.
To complement the amazing data processing capabilities explained above, Superalgos Markets Research offers a unique set of tools to produce data visualizations. The visual interface allows the setup of plotters, that is, devices that produce elaborate graphics based on configurations, with literally zero coding required.
The flat files your bots write as outputs feature one record per unit of time, in each of the standard time frames. Once you define how each record should be represented graphically, the graphics engine built into the system takes care of dynamically rendering all records on the screen.
The configuration is quite simple. You define coordinate points using the timeline as the X-axis and whatever data you have calculated as the Y-axis; each calculated value is paired with a datetime. Then, you define fill and stroke styles for polygons that use those coordinate points as vertices, et voilà!
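A minimal sketch of that coordinate mapping, assuming a simple linear scale from the visible time and value ranges to pixels (the function name and the view object are hypothetical, not the plotter's actual API):

```javascript
// Hypothetical mapping from a record (datetime, value) to a screen
// point, given a view describing the visible ranges and canvas size.
function toScreenPoint(record, view) {
    // Timeline maps linearly onto the X-axis.
    const x = (record.datetime - view.timeStart) /
              (view.timeEnd - view.timeStart) * view.width;
    // Screen Y grows downward, so the value axis is inverted.
    const y = view.height - (record.value - view.valueMin) /
              (view.valueMax - view.valueMin) * view.height;
    return { x, y };
}
```

Points produced this way would then serve as the vertices of the polygons to which the fill and stroke styles are applied.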