JavaScript API

Overview

DolphinDB JavaScript API is a JavaScript library that encapsulates operations on the DolphinDB database, such as connecting to the database, executing scripts, calling functions, and uploading variables.

https://www.npmjs.com/package/dolphindb

Features

  • Communicates with the DolphinDB database over WebSocket, exchanges data in binary format, and supports real-time push of streaming data
  • Runs in both browser and Node.js environments
  • Uses JavaScript TypedArrays such as Int32Array to process binary data with high performance
  • Supports serializing and uploading up to 2 GB of data in a single call; the amount of downloaded data is not limited

Installation

# 1. Install the latest versions of Node.js and your browser.

# 2. (Optional) Create a new project (skip this step if there's an existing project):
mkdir dolphindb-example
cd dolphindb-example
npm init --yes
# Open the package.json file with an editor and add the line "type": "module", below "main": "./index.js". This enables ECMAScript modules, so that in the following code you can use import { DDB } from 'dolphindb' to import the npm package (see the package.json sketch after step 3).

# 3. Install npm packages in your project
npm install dolphindb
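
As a reference for step 2, a minimal package.json after adding "type": "module" might look like the following sketch (the name and version fields depend on what npm init generated, and the dependencies entry added by npm install is omitted):

{
    "name": "dolphindb-example",
    "version": "1.0.0",
    "main": "./index.js",
    "type": "module"
}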

Usage

Initializing and Connecting to DolphinDB

NPM

import { DDB } from 'dolphindb'
// To import with CommonJS modules, use: const { DDB } = await import('dolphindb')
// Use in browser: import { DDB } from 'dolphindb/browser.js'

// Create a connection object pointing at a DolphinDB instance via its WebSocket URL (no actual network connection is established yet)
let ddb = new DDB('ws://127.0.0.1:8848')

// Use wss:// for an encrypted connection (WebSocket over TLS/HTTPS)
// let ddb = new DDB('wss://dolphindb.com')

// Establish a connection to DolphinDB (DolphinDB database version should be 1.30.16/2.00.4 or higher)
await ddb.connect()

CDN

<script type="module">
    import { DDB } from 'https://cdn.dolphindb.cn/assets/api.js'
    
    let ddb = new DDB('ws://127.0.0.1:8848')
    
    await ddb.connect()
</script>

Data for code completion and function prompts: https://cdn.dolphindb.cn/assets/docs.en.json
https://cdn.dolphindb.cn/assets/docs.zh.json

DDB options

let ddb = new DDB('ws://127.0.0.1:8848', {

    // Whether to log in automatically after establishing a connection, default `true`
    autologin: true,
    
    // DolphinDB username, default `'admin'`
    username: 'admin',
    
    // DolphinDB password, default `'123456'`
    password: '123456',
    
    // Set the Python session flag, default `false`
    python: false,
    
    // Set the SQL standard flag; use the SqlStandard enum to pass arguments, default `DolphinDB`
    // sql: SqlStandard.MySQL,
    // sql: SqlStandard.Oracle,
    
    // After setting this option, the database connection is only used for streaming data. For details, see section `Streaming Data`
    streaming: undefined
})

Calling Functions

Example

import { DdbInt } from 'dolphindb'

const result = await ddb.call('add', [new DdbInt(1), new DdbInt(1)])
// TypeScript: const result = await ddb.call<DdbInt>('add', [new DdbInt(1), new DdbInt(1)])

console.log(result.value === 2)  // true

Using DdbObj objects to represent data types in DolphinDB

In the preceding example, two arguments (each new DdbInt(1), corresponding to the INT type in DolphinDB) are uploaded to the DolphinDB database as arguments of the add function, and the result of the function call is received.

<DdbInt> is used by TypeScript to infer the type of the returned value.

  • result is a DdbInt, which is also a DdbObj<number>
  • result.form is a DdbForm.scalar
  • result.type is a DdbType.int
  • result.value is a native number in JavaScript (the value range and precision of int can be accurately represented by the JavaScript number type)
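
For instance, these fields can be inspected directly (a minimal sketch; DdbForm and DdbType are the enums shown further below):

import { DdbForm, DdbType } from 'dolphindb'

console.log(result.form === DdbForm.scalar)  // true
console.log(result.type === DdbType.int)     // true
console.log(result.value + 1)                // 3, since result.value is a native JavaScript number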

It is recommended to first understand the concepts related to TypedArray in JavaScript. You can refer to:
https://stackoverflow.com/questions/42416783/where-to-use-arraybuffer-vs-typed-array-in-javascript
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/TypedArray

/** Represent all data types in DolphinDB */
class DdbObj <T extends DdbValue = DdbValue> {
    /** Little endian or not */
    le: boolean
    
    /** For DolphinDB data forms, see https://ci.dolphindb.cn/en/Programming/DataTypesandStructures/DataForms/DataForms.html */
    form: DdbForm
    
    /** For DolphinDB data types, see https://ci.dolphindb.cn/en/Programming/DataTypesandStructures/DataTypes/DataTypes.html */
    type: DdbType
    
    /** Buffer length during parsing */
    length: number
    
    /** table name / column name */
    name?: string
    
    /**
        Lowest/1st dimension
        - vector: rows = n, cols = 1
        - pair:   rows = 2, cols = 1
        - matrix: rows = n, cols = m
        - set:    the same as vector
        - dict:   include vectors of keys and values
        - table:  the same as matrix
    */
    rows?: number
    
    /** 2nd dimension */
    cols?: number
    
    /** The actual data. Different DdbForm and DdbType use different types in DdbValue to represent actual data. */
    value: T
    
    /** Raw binary data, which is only available to top-level objects generated by parse_message when parse_object is false */
    buffer?: Uint8Array
    
    constructor (data: Partial<DdbObj> & { form: DdbForm, type: DdbType, length: number }) {
        Object.assign(this, data)
    }
}

class DdbInt extends DdbObj<number> {
    constructor (value: number) {
        super({
            form: DdbForm.scalar,
            type: DdbType.int,
            length: 4,
            value
        })
    }
}

// ... There are also many utility classes, such as DdbString, DdbLong, DdbDouble, DdbVectorDouble, DdbVectorAny, etc.

type DdbValue = 
    null | boolean | number | [number, number] | bigint | string | string[] | 
    Uint8Array | Int16Array | Int32Array | Float32Array | Float64Array | BigInt64Array | Uint8Array[] | 
    DdbObj[] | DdbFunctionDefValue | DdbSymbolExtendedValue
    

enum DdbForm {
    scalar = 0,
    vector = 1,
    pair = 2,
    matrix = 3,
    set = 4,
    dict = 5,
    table = 6,
    chart = 7,
    chunk = 8,
}


enum DdbType {
    void = 0,
    bool = 1,
    char = 2,
    short = 3,
    int = 4,
    long = 5,
    // ...
    timestamp = 12,
    // ...
    double = 16,
    symbol = 17,
    string = 18,
    // ...
}

Specifying form and type to manually create a DdbObj object

If there is no shortcut class, you can also specify form and type to manually create a DdbObj object:

// Created by the DdbDateTime shortcut class
new DdbDateTime(1644573600)

// Equivalent to manually creating an object of form = scalar, type = datetime through DdbObj
const obj = new DdbObj({
     form: DdbForm.scalar,
     type: DdbType.datetime,
     value: 1644573600,
     length: 0
})


// To find out which JavaScript type and value correspond to a given DolphinDB value, refer to the result returned by ddb.eval (see the `eval` method declaration below)
const obj = await ddb.eval('2022.02.11 10:00:00')
console.log(obj.form === DdbForm.scalar)
console.log(obj.type === DdbType.datetime)
console.log(obj.value)

// Another example is to create a set
// refer to ddb.eval
// const obj = await ddb.eval('set([1, 2, 3])')
// console.log(obj.value)
const obj = new DdbObj({
     form: DdbForm.set,
     type: DdbType.int,
     value: Int32Array.of(1, 2, 3),
     length: 0
})

// It's easier to use shortcut classes
const obj = new DdbSetInt(
     new Set([1, 2, 3])
)

NULL object in the form of a scalar

For the NULL object in the form of a scalar, the value corresponding to DdbObj is null in JavaScript:

;(await ddb.eval('double()')).value === null

// create NULL object
new DdbInt(null)
new DdbDouble(null)

call Method Declaration

async call <T extends DdbObj> (
    /** Function name */
    func: string,
    
    /** Call arguments (native strings and booleans passed in are automatically converted to DdbObj<string> and DdbObj<boolean>) */
    args?: (DdbObj | string | boolean)[] = [ ],
    
    /** Calling options */
    options?: {
        /** Urgent flag. The task is executed by an urgent worker to prevent it from being blocked by other jobs */
        urgent?: boolean
        
        /** When the node alias is set, the function is sent to the corresponding node in the cluster for execution (using the rpc method in DolphinDB) */
        node?: string
        
        /** When multiple node aliases are set, functions are sent to the corresponding nodes in the cluster for execution (using the pnodeRun method in DolphinDB) */
        nodes?: string[]
        
        /** Required when the node parameter is set; do not pass it otherwise. Specifies the function type */
        func_type?: DdbFunctionType
        
        /** Optional; may be passed when the nodes parameter is set and should not be passed otherwise */
        add_node_alias?: boolean
    } = { }
): Promise<T>
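
For example, a minimal sketch of passing calling options; 'dnode1' and 'dnode2' are hypothetical node aliases used only for illustration:

// Execute on an urgent worker so the call is not blocked by other jobs
const v = await ddb.call('version', [], { urgent: true })
console.log(v.value)

// Send the call to several cluster nodes (executed via pnodeRun on the server)
// const versions = await ddb.call('version', [], { nodes: ['dnode1', 'dnode2'] })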

Executing the Script

Example

const result = await ddb.eval(
    'def foo (a, b) {\n' +
    '    return a + b\n' +
    '}\n' +
    'foo(1l, 1l)\n'
)

// TypeScript:
// import type { DdbLong } from 'dolphindb'
// const result = await ddb.eval<DdbLong>(...)

console.log(result.value === 2n)  // true

In the preceding example, a script is uploaded as a string to the DolphinDB database for execution, and the execution result of the last statement, foo(1l, 1l), is received.

<DdbLong> is used by TypeScript to infer the type of the return value

  • result is a DdbLong, which is also a DdbObj<bigint>
  • result.form is DdbForm.scalar
  • result.type is DdbType.long
  • result.value is the native bigint in JavaScript (the precision of long cannot be accurately represented by JavaScript number, but it can be represented by bigint)

As long as the WebSocket connection is not disconnected, the custom function foo will continue to exist in the session and can be reused; for example, you can call it again over the same connection as sketched below:
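
import { DdbInt } from 'dolphindb'

// Reuse the custom function foo defined by the previous script over the same connection
const sum = await ddb.call('foo', [new DdbInt(1), new DdbInt(1)])
// TypeScript: const sum = await ddb.call<DdbInt>('foo', [new DdbInt(1), new DdbInt(1)])

console.log(sum.value === 2)  // true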

eval Method Declaration

async eval <T extends DdbObj> (
    /** Script for execution */
    script: string,
    
    /** Calling options */
    options: {
        /** Urgent flag. The task is executed by an urgent worker to prevent it from being blocked by other jobs */
        urgent?: boolean
    } = { }
): Promise<T>
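
For example, a minimal sketch of passing the urgent flag (the script itself is arbitrary):

// Execute a short script on an urgent worker so it is not queued behind other jobs
const result = await ddb.eval('1 + 1', { urgent: true })

console.log(result.value)  // 2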

Uploading Variables

Example

import { DdbVectorDouble } from 'dolphindb'

let a = new Array(10000)
a.fill(1.0)

await ddb.upload(['bar1', 'bar2'], [new DdbVectorDouble(a), new DdbVectorDouble(a)])

In the preceding example, two variables, bar1 and bar2, are uploaded; the value of each is a DOUBLE vector of length 10000.

As long as the WebSocket connection remains open, the variables bar1 and bar2 will continue to exist in the session and can be reused in later scripts, for example (a minimal sketch):
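
const result = await ddb.eval('avg(bar1 + bar2)')

console.log(result.value)  // 2, since every element of bar1 and bar2 is 1.0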

upload Method Declaration

async upload (
    /** Variable names */
    vars: string[],
    
    /** Variable values */
    args: (DdbObj | string | boolean)[]
): Promise<void>

Examples

import { nulls, DdbInt, timestamp2str, DdbVectorSymbol, DdbTable, DdbVectorDouble } from 'dolphindb'

// Format timestamp in DolphinDB as string
timestamp2str(
    (
        await ddb.call('now', [false])
        // TypeScript: await ddb.call<DdbObj<bigint>>('now', [false])
    ).value
) === '2022.02.23 17:23:13.494'

// Create a symbol vector
new DdbVectorSymbol(['aaa', 'aaa', 'aaa', 'aaa', 'aaa', 'bbb'])

// Create a DOUBLE vector containing NULL values using native arrays in JavaScript
new DdbVectorDouble([0.1, null, 0.3])

// Create a DOUBLE vector using JavaScript TypedArray, which is more efficient and saves more memory
let av = new Float64Array(3)
av[0] = 0.1
av[1] = nulls.double
av[2] = 0.3
new DdbVectorDouble(av)

// Create a DdbTable
new DdbTable(
    [
        new DdbVectorDouble([0.1, 0.2, null], 'col0'),
        new DdbVectorSymbol(['a', 'b', 'c'], 'col1')
    ],
    'mytable'
)
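
// A minimal sketch of using the table above: upload it as a server-side variable
// named mytable and query it with SQL (the variable name is only an example)
await ddb.upload(['mytable'], [
    new DdbTable(
        [
            new DdbVectorDouble([0.1, 0.2, null], 'col0'),
            new DdbVectorSymbol(['a', 'b', 'c'], 'col1')
        ],
        'mytable'
    )
])

const filtered = await ddb.eval('select * from mytable where col0 > 0.15')
console.log(filtered.rows)  // 1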

Streaming Data

// Configure a new connection dedicated to streaming data
let sddb = new DDB('ws://192.168.0.43:8800', {
    autologin: true,
    username: 'admin',
    password: '123456',
    streaming: {
        table: 'Stream table name to subscribe to',
        
        // Streaming data processing callback. The type of message is StreamingData.
        handler (message) {
            console.log(message)
        }
    }
})

// Establish the connection
await sddb.connect()

Streaming data received after the connection is established is passed to the handler as the message parameter. The type of message is StreamingData, defined as follows:

export interface StreamingParams {
    table: string
    action?: string
    
    handler (message: StreamingData): any
}

export interface StreamingData extends StreamingParams {
    /**
        The message sending time of the server (nanoseconds since epoch)
        std::chrono::system_clock::now().time_since_epoch() / std::chrono::nanoseconds(1)
    */
    time: bigint
    
    /** message id */
    id: bigint
    
    colnames: string[]
    
    /** Subscription topic, which is the name of a subscription.
        It is a string consisting of the alias of the node where the subscription table is located, the stream table name, and the subscription task name (if actionName is specified), separated by `/`
    */
    topic: string
    
    /** Streaming data of the any vector type. Each element corresponds to a column (without name) of the subscribed table, and the content in the column (DdbObj<DdbVectorValue>) is the new data value */
    data: DdbObj<DdbVectorObj[]>
    
    /** Number of rows of newly added streaming data */
    rows: number
    
    window: {
        /** offset equals 0 when initiating the connection and gradually increases as the window moves */
        offset: number
        
        /** Total number of rows across all segments (sum of each segment's rows) */
        rows: number
        
        /** An array of the data batches received so far, one per message */
        segments: DdbObj<DdbVectorObj[]>[]
    }
    
    /** After the subscription succeeds, if a subsequent message fails to parse, error is set and the handler is still called */
    error?: Error
}
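
As an illustration, a minimal handler sketch that uses some of these fields; it assumes a shared stream table named 'prices' exists on the server, which is hypothetical:

import { DDB } from 'dolphindb'

let sddb = new DDB('ws://127.0.0.1:8848', {
    streaming: {
        table: 'prices',
        handler (message) {
            // error is set when a message fails to parse after a successful subscription
            if (message.error) {
                console.error(message.error)
                return
            }
            // rows: number of newly appended rows; colnames: column names of the subscribed table
            console.log(`received ${message.rows} rows`, message.colnames)
            // message.data.value is an array of column vectors (DdbObj) holding the new values
        }
    }
})

await sddb.connect()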

Development

# Install the latest version of Node.js
# https://nodejs.org/en/download/current/

# Install the pnpm package manager
corepack enable
corepack prepare pnpm@latest --activate

git clone https://github.com/dolphindb/api-javascript.git

cd api-javascript

# Install project dependencies
pnpm install

# Copy .vscode/settings.template.json to .vscode/settings.json
cp .vscode/settings.template.json .vscode/settings.json

# Refer to scripts in package.json

# Build
pnpm run build

# lint
pnpm run lint

# Test
pnpm run test

# Scan entries
pnpm run scan
# Manually complete untranslated entries
# Run the scan again to update the dictionary file dict.json
pnpm run scan