Introduction to DataScope Select (DSS) REST API

Christiaan Meihsl
Head of Developer Content, Platform Application Developer

Last Updated: 20 Mar 2024


This article provides a brief introduction to the DataScope Select API. For a more extensive presentation which covers this content and more, you can have a look at the recorded webinar Introduction to DataScope Select REST API.

Data sourcing conundrum

In addition to using real-time streaming data, our customers often require timely and easy access to high quality non-streaming pricing and reference data, with cross-asset and worldwide coverage.

Requirements for historical pricing data range from end-of-day prices to intraday bars of varying widths, and even tick and market depth data.

There are several use cases for this:

  • portfolio managers and fund administrators seek validated and evaluated pricing data, corporate actions and benchmarks;
  • credit analysts use company and sector specific information like credit ratings, estimates and more;
  • risk managers require accurate data for end-of-day calculations;
  • trading systems require time series for backtesting;
  • back office and settlement departments need validated end-of-day prices;
  • compliance officers use intraday time series.

The list of available data sets is of course not limited to these: you can also get commodities, analytics, estimates, entity data or news.

Meeting all the requirements is essential but can be very challenging, especially when working with multiple data sources.

Our answer to the challenge

DataScope Select, or DSS, fits all of these requirements nicely. It is an internet hosted enterprise platform for non-streaming data. It supports a wide variety of data sets, with excellent cross-asset and worldwide coverage, and was specifically designed to handle bulk requests with thousands of instruments in them.

You can use it from a browser and retrieve data files through FTP, but here we are going to examine the use of its REST API.

Using DataScope Select

Extraction mechanisms

Data extraction is a large topic in itself, but I would like to briefly walk you through the process. There are two basic mechanisms to request data:

  • scheduled - the extraction occurs at a pre-defined moment in time, either as a single or recurring event; these extractions can be created using either the web interface or the API;
  • on demand - an immediate extraction, requested through high level calls only available via the API.

The scheduled approach uses several artifacts which you must create and manage on the DataScope Select server. On demand requests are self-contained: they do not require anything to be stored on the server.

The choice of the extraction mechanism is determined by your workflow requirements. The two mechanisms are not mutually exclusive, so you can use both, for instance using scheduled extractions for your regular data needs, and using on demand extractions for ad hoc queries.

Scheduled extraction

To schedule an extraction, you need to:

  • create and populate an instrument list;
  • define a report template, by choosing one from a list of available defaults, and customizing it by selecting the data fields you require. The list of available fields is specific to each report template;
  • set an extraction schedule, single or recurring, triggered at a specific time, or by data availability. A schedule refers to an existing instrument list and report template.

The above artifacts are stored under your user profile on the DataScope Select servers.

Once you have these, you can query the DataScope Select server for the status of your extraction. When the extraction has completed you can retrieve the results.

All these tasks can be performed using either the user interface or the API.

In order to give you an idea of what content is available, here is the list of report templates (as of December 2016):

  • pricing data: end-of-day regular and premium, intraday regular and premium, single historical price, time series pricing, tick history;
  • entity data: audit, detail and hierarchy;
  • reference data: ratings, bond schedules, factor history for MBS and tranches, fund allocation, ownership, symbol cross-reference, terms and conditions;
  • analytics: fixed income, technical indicators and StarMine;
  • corporate actions: standard events, IPO events and ISO 15022;
  • commodities: independent price assessments, fundamentals, forward prices;
  • estimates: summary, detailed estimates, actuals, company-level and detail footnotes;
  • news: news analytics, commodities and news items;
  • pricing and reference data composite reports.

As you can see, the list is quite extensive. New data sets are added regularly in response to requests from market participants.

On demand extraction

On demand extractions are easier to use, as they do not require you to pre-define instrument lists or report templates on the server.

When you use the API for extractions, it delivers the data directly to your application, thus eliminating the need to manually download the result files.

On demand requests include:

  • an instrument list, this is managed on the client side;
  • a data type, which is a reference to a default report template;
  • a list of selected data fields.

On demand requests are essentially made in a single call, but you retain the same level of control over what you request and how.

Also, none of the extraction steps leave a footprint on the servers, as the requests are generated dynamically.
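The three components of an on demand request map directly onto a JSON payload. As an illustrative sketch (in Python, using the OwnershipExtractionRequest template that appears later in this article; the helper name is my own), the payload can be assembled like this:

```python
import json

# Sketch: assemble an on demand extraction request body from the three
# components: a data type (@odata.type), a field list, and an instrument list.
def build_ondemand_request(fields, rics):
    return {
        "ExtractionRequest": {
            "@odata.type": "#DataScope.Select.Api.Extractions."
                           "ExtractionRequests.OwnershipExtractionRequest",
            # The fields you want returned, a subset of the template's full list.
            "ContentFieldNames": list(fields),
            # The instrument list is managed on the client side.
            "IdentifierList": {
                "@odata.type": "#DataScope.Select.Api.Extractions."
                               "ExtractionRequests.InstrumentIdentifierList",
                "InstrumentIdentifiers": [
                    {"Identifier": ric, "IdentifierType": "Ric"}
                    for ric in rics
                ],
            },
        }
    }

body = build_ondemand_request(["Issuer Name", "Shares Held"], ["IBM.N"])
print(json.dumps(body, indent=2))
```

Because the whole request is generated dynamically on the client, nothing needs to be created on the server beforehand.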

How to choose the API?

Two versions of the API are currently available: SOAP and REST. If you have not yet started using DataScope Select, go for the newer REST API. We also encourage you to migrate existing solutions from SOAP to REST.

Why favour the REST API?

  • more functionality and more data types are available;
  • it offers better performance and higher extraction limits;
  • it supports .NET C# through a native SDK;
  • REST supports asynchronous requests, SOAP does not.

Moreover, as the SOAP API is not being developed further, all new features will only be made available in the REST API.

One of the key distinguishing features of the REST API is how it deals with embargoed data. For instance, if you request a snapshot of the real-time price of an equity traded on an exchange you are not entitled to, the REST API returns the result after a delay predefined by the data provider.


In this article we use Postman, a free-of-charge, full-featured client for RESTful APIs. A valid DataScope Select account is also necessary.

Retrieving the authorization token

To obtain an authorization token, we send an authentication request to the DataScope API:

  1. Set the endpoint to the authentication URL;
  2. Select the Headers tab and add two items: Prefer: respond-async and Content-Type: application/json;
  3. Select the POST method;
  4. Select the Body tab, then raw, and paste the following JSON, replacing the placeholders with your DataScope Select credentials:

     {
         "Credentials": {
             "Username": "your_username",
             "Password": "your_password"
         }
     }

  5. Hit Send;
  6. If everything was done correctly, you receive a response of this form:

     {
         "@odata.context": "$metadata#Edm.String",
         "value": "your_token"
     }

  7. Save your token; you need it for the next request.
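The same authentication call can be sketched in Python with only the standard library. The host in DSS_BASE is a placeholder you must replace with your own DataScope Select server, and the /Authentication/RequestToken path is assumed from the DSS REST documentation; no request is actually sent until send_auth is called:

```python
import json
import urllib.request

# Placeholder: substitute your actual DataScope Select REST API host.
DSS_BASE = "https://example-dss-host/RestApi/v1"

def build_auth_request(username, password):
    """Build the authentication POST: headers plus JSON body."""
    headers = {
        "Prefer": "respond-async",
        "Content-Type": "application/json",
    }
    body = {"Credentials": {"Username": username, "Password": password}}
    return headers, json.dumps(body)

def send_auth(username, password):
    """Send the authentication request and return the token string."""
    headers, body = build_auth_request(username, password)
    req = urllib.request.Request(
        DSS_BASE + "/Authentication/RequestToken",
        data=body.encode("utf-8"),
        headers=headers,
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # The token is in the "value" member of the JSON response.
        return json.load(resp)["value"]
```

The returned token is then passed in the Authorization header of every subsequent request.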

Building a data request

As an example, let us build an On Demand data extraction request for ownership data.

  1. Open a new tab in Postman and set the endpoint to the extraction URL;
  2. For all requests we must add our token to the Headers section, like this:

     Prefer: respond-async
     Content-Type: application/json
     Authorization: Token your_token

  3. Select the POST method;
  4. Select the Body tab, then raw, and paste the following JSON content:

     {
         "ExtractionRequest": {
             "@odata.type": "#DataScope.Select.Api.Extractions.ExtractionRequests.OwnershipExtractionRequest",
             "ContentFieldNames": [
                 "Asset Status",
                 "Asset Status Description",
                 "Change Sign",
                 "Currency Code",
                 "Issuer Name",
                 "Holdings Report Date",
                 "Owner Country",
                 "Owner Name",
                 "Owner Type",
                 "Owner Type Description",
                 "Percent of Shares Outstanding",
                 "Primary Issue Flag",
                 "Security Description",
                 "Shares Held",
                 "Value Held",
                 "Shares Changed",
                 "Value Changed"
             ],
             "IdentifierList": {
                 "@odata.type": "#DataScope.Select.Api.Extractions.ExtractionRequests.InstrumentIdentifierList",
                 "InstrumentIdentifiers": [{
                     "Identifier": "IBM.N",
                     "IdentifierType": "Ric"
                 }]
             }
         }
     }

  5. Hit Send;

For this request we defined three parameters:

  • @odata.type: the default template, in our case OwnershipExtractionRequest;
  • ContentFieldNames: the list of fields we are interested in. These are a subset of the full list which you can display in the GUI, or retrieve by querying the API;
  • IdentifierList: just one instrument, IBM.N. You can define several, and you can also mix identifier types (ISIN, Cusip, Ric, etc.).

The response

If all the details were set correctly, we can expect either of two HTTP response statuses: 200 OK or 202 Accepted. In the first case, the response contains the requested data, as an array of ownership report objects. The first one looks like this:

{
    "@odata.context": "$metadata#Collection(DataScope.Select.Api.Extractions.ExtractionRequests.ExtractionRow)",
    "value": [{
        "IdentifierType": "Ric",
        "Identifier": "IBM.N",
        "Asset Status": "ISS",
        "Asset Status Description": "Issued",
        "Change Sign": "P",
        "Currency Code": "USD",
        "Issuer Name": "International Business Machines Corp",
        "Holdings Report Date": "2016-09-30",
        "Owner Country": "United States",
        "Owner Name": "xxxxx Management Group, Inc.",
        "Owner Type": "IA",
        "Owner Type Description": "Investment Advisor",
        "Percent of Shares Outstanding": 0.01,
        "Primary Issue Flag": "N",
        "Shares Held": 48323,
        "Value Held": 7676109,
        "Shares Changed": 43883,
        "Value Changed": 6970815
    }]
}

If you receive a 202 Accepted response, you need to grab the Location value from the response headers; it contains a URL which you then poll until you receive a 200 OK status with the data.
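This 200/202 handling can be sketched as a small polling loop. The sketch below is illustrative: get is an injected helper (not part of the DSS API) that performs an authenticated HTTP GET and returns the status code, headers and parsed JSON body:

```python
import time

def handle_extraction_response(status, headers, body):
    """Return ('data', rows) for 200 OK, or ('poll', url) for 202 Accepted."""
    if status == 200:
        # OData collection responses carry the rows in the "value" array.
        return ("data", body.get("value", []))
    if status == 202:
        # The Location header holds the URL to poll for the result.
        return ("poll", headers["Location"])
    raise RuntimeError(f"Unexpected HTTP status: {status}")

def poll_until_ready(get, url, interval=5.0):
    """Poll with get(url) -> (status, headers, body) until the data arrives."""
    while True:
        status, headers, body = get(url)
        kind, payload = handle_extraction_response(status, headers, body)
        if kind == "data":
            return payload
        url = payload  # 202: keep polling the Location URL
        time.sleep(interval)
```

Separating the decision logic from the transport keeps the loop easy to test, and the same pattern applies to any on demand request that may be queued server-side.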


As you can see, using the REST API is quite straightforward, but this is only the beginning. For more information, you can go to the DataScope Select section of this portal, where you will find many tutorials, code samples and documentation. You can also check out other articles on this topic.