SDK, C#, and SQL Server Datatypes

Looking at the Data API v2 Historical Data documentation online, prices are described as ‘number’ with decimal places ranging from 0 to 6. Looking at the C# SDK IBar interface (for example), all numeric types are ‘decimal’. In SQL Server, the default decimal datatype is decimal(18,0). Does anyone have a recommendation for which C# and SQL Server datatypes to use to minimize rounding inconsistencies?

There is no direct mapping between the .NET and MS SQL decimal data types. The .NET type selects precision dynamically, while the SQL version has a fixed scale chosen at column creation. For this particular case, I’d recommend the decimal(19,6) SQL data type as a good compromise between storage size and precision. Or, if you’re worried about really big prices, you can use the decimal(28,6) SQL type. Both will map into the .NET Decimal data type without any rounding or data loss.
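
A sketch of how that mapping looks in practice with Microsoft.Data.SqlClient (the table and column names are just illustrative, not part of the SDK). Declaring the parameter’s precision and scale explicitly keeps the provider from inferring a scale from the runtime value:

```csharp
using System.Data;
using Microsoft.Data.SqlClient;

// Assumed schema (illustrative):
//   CREATE TABLE Bars (
//       Symbol  varchar(16)   NOT NULL,
//       [Close] decimal(19,6) NOT NULL
//   );

static class BarWriter
{
    public static void InsertClose(SqlConnection conn, string symbol, decimal close)
    {
        using var cmd = new SqlCommand(
            "INSERT INTO Bars (Symbol, [Close]) VALUES (@symbol, @close)", conn);

        cmd.Parameters.Add("@symbol", SqlDbType.VarChar, 16).Value = symbol;

        // Declare precision/scale explicitly so the value is sent
        // as decimal(19,6) instead of whatever scale the runtime
        // value happens to carry.
        var price = cmd.Parameters.Add("@close", SqlDbType.Decimal);
        price.Precision = 19;
        price.Scale = 6;
        price.Value = close;

        cmd.ExecuteNonQuery();
    }
}
```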

Thanks Oleg. I was hoping for an Alpaca ‘best practice’ on how to address rounding in the C# and SQL layers when using the SDK. It appears it’s up to us developers to address this however best fits the problem at hand. I stepped back, looked at my problem statement, and quickly realized that performance matters more to me than the third, fourth, etc. decimal place. So I wrote a small benchmark comparing ‘double’ to ‘decimal’: the native ‘double’ datatype performs about 5 times faster than the ‘decimal’ struct. For me, using the native ‘double’ datatype is a no-brainer. Thanks again for your reply.
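
Something along these lines (the iteration count and arithmetic here are illustrative, not my exact test):

```csharp
using System;
using System.Diagnostics;

const int N = 50_000_000;

// Time the same accumulate-a-price workload in double...
var sw = Stopwatch.StartNew();
double dSum = 0;
for (int i = 0; i < N; i++) dSum += (i % 100) * 0.01;
sw.Stop();
Console.WriteLine($"double : {sw.ElapsedMilliseconds} ms (sum {dSum})");

// ...and in decimal, which does the arithmetic in software.
sw.Restart();
decimal mSum = 0;
for (int i = 0; i < N; i++) mSum += (i % 100) * 0.01m;
sw.Stop();
Console.WriteLine($"decimal: {sw.ElapsedMilliseconds} ms (sum {mSum})");
```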

Double is of course faster, but rounding issues will hit you very fast if you do a lot of math with source prices. Just don’t forget about it: sometimes it’s not critical, but in some cases it can be important.
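
A two-line illustration of the kind of surprise I mean, since binary floating point can’t represent most decimal prices exactly:

```csharp
using System;

double dSum = 0.1 + 0.2;
decimal mSum = 0.1m + 0.2m;

Console.WriteLine(dSum == 0.3);   // False: dSum is 0.30000000000000004
Console.WriteLine(mSum == 0.3m);  // True: decimal keeps base-10 exactness
```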

Agreed… But, in my experience, slippage typically dwarfs any decimal-precision error. If my model only works with 3+ decimal places of price precision, I have bigger problems :wink: