I need to store a DateTime value that is sent to the database from a C# application using DateTime.UtcNow. If I save it to a DateTime column, the milliseconds value is always 000. But while debugging, I can see the milliseconds value being sent from the application to the database. What am I missing?
SQL Server's datetime type only stores time to an accuracy of approximately 1/300th of a second, so the milliseconds value always falls on an increment ending in 0, 3, or 7.
SQL Server 2008 and later offer much more precision: the datetime2 datatype will accurately store values like this:
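A quick sketch of the difference (the literal value here is just an illustrative example; run it in any SQL Server 2008+ session):

```sql
-- datetime rounds to the nearest 1/300 s tick (.000, .003, .007, ...)
DECLARE @dt  datetime     = '2024-01-15 10:30:00.125';
-- datetime2 keeps the milliseconds exactly (precision up to 100 ns)
DECLARE @dt2 datetime2(3) = '2024-01-15 10:30:00.125';

SELECT @dt  AS dt_value;   -- 2024-01-15 10:30:00.127 (rounded)
SELECT @dt2 AS dt2_value;  -- 2024-01-15 10:30:00.125 (exact)
```

If the column itself is datetime (or smalldatetime, which has no fractional seconds at all), changing it to datetime2 lets the milliseconds from DateTime.UtcNow survive the round trip.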