Setting the timezone programmatically only works for positive UTC offsets
I've written the following code to programmatically set the timezone on my machine. It works fine for a positive UTC offset, such as New Zealand Standard Time (+12:00). For a negative UTC offset, such as Mountain Standard Time (-07:00), the code runs without errors, but the timezone ends up set to International Date Line West (-12:00).
Did I miss something?
Here is the code I'm using:
[StructLayout(LayoutKind.Sequential, CharSet = CharSet.Unicode)]
public struct TimeZoneInformation
{
    public int Bias;
    [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 32)]
    public string StandardName;
    public SystemTime StandardDate;
    public int StandardBias;
    [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 32)]
    public string DaylightName;
    public SystemTime DaylightDate;
    public int DaylightBias;

    public static TimeZoneInformation FromTimeZoneInfo(TimeZoneInfo timeZoneInfo)
    {
        var timeZoneInformation = new TimeZoneInformation();
        timeZoneInformation.StandardName = timeZoneInfo.StandardName;
        timeZoneInformation.DaylightName = timeZoneInfo.DaylightName;

        // Read the 44-byte REG_TZI_FORMAT blob for this timezone from the registry.
        var timeZoneRegistryPath = @"HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Time Zones\" + timeZoneInfo.Id;
        var tzi = (byte[])Microsoft.Win32.Registry.GetValue(timeZoneRegistryPath, "TZI", new byte[] { });
        if (tzi == null || tzi.Length != 44)
        {
            throw new ArgumentException("Invalid REG_TZI_FORMAT");
        }

        timeZoneInformation.Bias = BitConverter.ToInt32(tzi, 0);
        timeZoneInformation.StandardBias = BitConverter.ToInt32(tzi, 4);
        timeZoneInformation.DaylightBias = BitConverter.ToInt32(tzi, 8);
        timeZoneInformation.StandardDate.Year = BitConverter.ToInt16(tzi, 12);
        timeZoneInformation.StandardDate.Month = BitConverter.ToInt16(tzi, 14);
        timeZoneInformation.StandardDate.DayOfWeek = BitConverter.ToInt16(tzi, 16);
        timeZoneInformation.StandardDate.Day = BitConverter.ToInt16(tzi, 18);
        timeZoneInformation.StandardDate.Hour = BitConverter.ToInt16(tzi, 20);
        timeZoneInformation.StandardDate.Minute = BitConverter.ToInt16(tzi, 22);
        timeZoneInformation.StandardDate.Second = BitConverter.ToInt16(tzi, 24);
        timeZoneInformation.StandardDate.Millisecond = BitConverter.ToInt16(tzi, 26);
        timeZoneInformation.DaylightDate.Year = BitConverter.ToInt16(tzi, 28);
        timeZoneInformation.DaylightDate.Month = BitConverter.ToInt16(tzi, 30);
        timeZoneInformation.DaylightDate.DayOfWeek = BitConverter.ToInt16(tzi, 32);
        timeZoneInformation.DaylightDate.Day = BitConverter.ToInt16(tzi, 34);
        timeZoneInformation.DaylightDate.Hour = BitConverter.ToInt16(tzi, 36);
        timeZoneInformation.DaylightDate.Minute = BitConverter.ToInt16(tzi, 38);
        timeZoneInformation.DaylightDate.Second = BitConverter.ToInt16(tzi, 40);
        timeZoneInformation.DaylightDate.Millisecond = BitConverter.ToInt16(tzi, 42);
        return timeZoneInformation;
    }
}

[DllImport("kernel32.dll", SetLastError = true)]
public static extern bool SetTimeZoneInformation([In] ref TimeZoneInformation timeZoneInformation);

var t = TimeZoneInformation.FromTimeZoneInfo(TimeZoneInfo.FindSystemTimeZoneById("Mountain Standard Time"));
SetTimeZoneInformation(ref t);
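The SystemTime struct isn't shown above; for the field-by-field marshaling to work, it has to mirror the 16-byte Win32 SYSTEMTIME layout, i.e. eight 16-bit fields in this order:

```csharp
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
public struct SystemTime
{
    public short Year;
    public short Month;
    public short DayOfWeek;
    public short Day;
    public short Hour;
    public short Minute;
    public short Second;
    public short Millisecond;
}
```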
In public struct TimeZoneInformation I had originally defined Bias and StandardBias as long instead of int. In the CLR, long is always a 64-bit value, unlike C++, where long is usually (though not always) 32 bits. The two long fields increased the size of my structure by a total of 64 bits, which shifted every subsequent field and caused the native code to misinterpret the values it saw. It was purely by accident that positive-UTC timezones worked.
I've corrected the code above, and it now successfully sets the timezone, in case anyone is interested.
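The size difference is easy to see with Marshal.SizeOf. The structs below are hypothetical minimal examples (not the real TimeZoneInformation) that isolate just the two bias fields:

```csharp
using System;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
public struct BiasAsInt  { public int Bias;  public int StandardBias; }

[StructLayout(LayoutKind.Sequential)]
public struct BiasAsLong { public long Bias; public long StandardBias; }

public static class SizeDemo
{
    public static void Main()
    {
        // int fields: 2 * 4 bytes = 8 bytes, matching the native LONG fields.
        Console.WriteLine(Marshal.SizeOf<BiasAsInt>());
        // long fields: 2 * 8 bytes = 16 bytes, 64 bits too large in total.
        Console.WriteLine(Marshal.SizeOf<BiasAsLong>());
    }
}
```

In the full struct the extra bytes push StandardName and everything after it to the wrong offsets, which is why the native side reads garbage biases.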