1.30.19

1.30.19.4

New Features

  • Added an error message when the value specified for batchSize in stream subscription is a decimal.

Improvements

  • When subscribing to a stream table with msgAsTable=True and batchSize set to a positive integer, messages are now processed in blocks (see the sketch after this list).

  • The Python API now supports NumPy versions up to 1.23.4 and pandas versions up to 1.5.2.

  • Error message enhancements for data uploads.

  • Error message enhancements for Python API on macOS.
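
A minimal sketch of the block-wise processing noted above, assuming an existing stream table named trades; the connection details, ports, and the parameter order of subscribe are placeholders/assumptions:

    import dolphindb as ddb

    s = ddb.session()
    s.connect("localhost", 8848, "admin", "123456")

    def handler(msgs):
        # With msgAsTable=True and a positive batchSize, each call receives a
        # pandas DataFrame containing a whole block of messages.
        print(len(msgs))

    s.enableStreaming(8900)   # local listening port for the subscription
    s.subscribe("localhost", 8848, handler, "trades", "blockAction", offset=0,
                msgAsTable=True, batchSize=1000)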

Bug Fixes

  • Fixed an error when downloading data containing timestamps before 1970.

  • Fixed a failure when writing data containing columns of type INT128/IPADDR/UUID/BLOB through tableAppender, tableUpsert and PartitionedTableAppender.

  • Fixed a server-side memory leak caused by temporary database or table handles that were not destroyed when deleting a partition with s.dropPartition or loading a table with s.loadTable.

1.30.19.3

New Features

  • Added new setTimeOut method to the session class for configuring the TCP connection option TCP_USER_TIMEOUT (see the sketch after this list). The method is only available on Linux.

  • Added new parameter sortKeyMappingFunction to the createPartitionedTable method for dimensionality reduction of sort keys.
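
A minimal sketch of the TCP timeout setting above; it assumes the argument is a timeout in seconds and that setTimeOut is called before connect (Linux only), with placeholder connection details:

    import dolphindb as ddb

    s = ddb.session()
    s.setTimeOut(30)   # assumed to set TCP_USER_TIMEOUT, in seconds
    s.connect("localhost", 8848, "admin", "123456")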

Improvements

  • You can now upload a DataFrame with specified column data types by setting its __DolphinDB_Type__ attribute, as shown below.
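
A minimal sketch of a typed upload via __DolphinDB_Type__, assuming the attribute takes a mapping from column names to the DT_* type constants in dolphindb.settings; the variable names and columns are placeholders:

    import dolphindb as ddb
    import dolphindb.settings as keys
    import pandas as pd

    s = ddb.session()
    s.connect("localhost", 8848, "admin", "123456")

    df = pd.DataFrame({"sym": ["A", "B"], "qty": [100, 200]})
    # Upload "sym" as SYMBOL and "qty" as LONG instead of the default inferred types.
    df.__DolphinDB_Type__ = {"sym": keys.DT_SYMBOL, "qty": keys.DT_LONG}
    s.upload({"t": df})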

Bug Fixes

  • Fixed an issue where the result of uploading a Boolean object was incorrect.

1.30.19.2

New Features

  • Support function hints.

  • Support official Python 3.8-3.9 on Windows.

  • Support uploading data with function runTaskAsync of DBConnectionPool.

  • Added new method enableJobCancellation to session on Linux. Once enabled, you can use Ctrl+C to cancel all session.run() tasks that are being executed (see the sketch after this list).

  • Support Python 3.7-3.9 in conda environment on Linux aarch64.
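
A minimal sketch of job cancellation as described above, assuming enableJobCancellation is invoked on the session before running jobs (Linux only); the connection details and script are placeholders:

    import dolphindb as ddb

    s = ddb.session()
    s.connect("localhost", 8848, "admin", "123456")
    s.enableJobCancellation()   # after this, Ctrl+C cancels executing s.run() jobs

    # A long-running call that can now be interrupted with Ctrl+C.
    s.run("sleep(60000)")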

Bug Fixes

  • Fixed an issue where the server did not automatically release resources after a Table object was deleted.

1.30.19.1

New Features

  • Added new system variable version. You can check the version number of the API through dolphindb.__version__.

  • tableAppender now supports array vectors.

  • Support the reconnect parameter for session.connect to automatically reconnect to the node in scenarios where high availability is not enabled.

  • Added new class streamDeserializer for parsing heterogeneous stream tables. Added the streamDeserializer parameter to function subscribe to receive data parsed by a streamDeserializer.

  • When a script executed with s.run() calls print(), the printed output can now be displayed on the API side.

  • Added new object tableUpsert (see the sketch after this list).

  • Added new parameters mode and modeOption for MultithreadedTableWriter to update an indexed table, keyed table, or DFS table through upsert.

  • Support uploading and reading array vectors of INT128, UUID, and IP types. Please set enablePickle=False before you upload or read array vectors of these types.
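
A minimal sketch tying together several of the entries above (the version variable, reconnect, and tableUpsert); the tableUpsert parameter names, the DFS path, table name, and key column are assumptions:

    import dolphindb as ddb
    import pandas as pd

    print(ddb.__version__)   # check the API version

    s = ddb.session()
    # reconnect=True re-establishes the connection to the node automatically
    # when high availability is not enabled.
    s.connect("localhost", 8848, "admin", "123456", reconnect=True)

    # Upsert rows into a DFS table keyed on "id"; dfs://demo and pt are placeholders.
    upserter = ddb.tableUpsert(dbPath="dfs://demo", tableName="pt",
                               ddbSession=s, keyColNames=["id"])
    df = pd.DataFrame({"id": [1, 2], "qty": [100, 200]})
    upserter.upsert(df)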

Improvements

  • Standardized the handling of null values.

  • The enableASYN parameter of the session object is deprecated. Please use enableASYNC instead.

  • Achieved load balancing of requests when connecting to the cluster through the API.

  • When writing to an in-memory table with MultithreadedTableWriter, dbPath must be set to NULL, and tableName must be specified as the in-memory table name.

Bug Fixes

  • Fixed an issue where data queried through the API could not be downloaded if it contained garbled characters.

  • Fixed an issue where the port was not released in time after the session was closed.