SendMessage to simulate right-click crashes the target app
I'm writing a C# automation tool. Since Microsoft UI Automation doesn't provide any way of simulating right-clicks or raising context menus, I'm using SendMessage to do this instead. I'd rather not use SendInput because I don't want to have to grab focus. When I call SendMessage, however, it crashes the target app.

Here's my code:
[DllImport("user32.dll", CharSet = CharSet.Auto)]
static extern IntPtr SendMessage(IntPtr hWnd, UInt32 Msg, IntPtr wParam, IntPtr lParam);

public void RightClick<T>(T element) where T : AutomationElementWrapper
{
    const int MOUSEEVENTF_RIGHTDOWN = 0x0008; /* right button down */
    const int MOUSEEVENTF_RIGHTUP = 0x0010; /* right button up */

    var point = element.Element.GetClickablePoint();
    var processId = element.Element.GetCurrentPropertyValue(AutomationElement.ProcessIdProperty);
    var window = AutomationElement.RootElement.FindFirst(
        TreeScope.Children,
        new PropertyCondition(AutomationElement.ProcessIdProperty, processId));
    var handle = window.Current.NativeWindowHandle;
    var x = point.X;
    var y = point.Y;
    var value = ((int)x) << 16 + (int)y;
    SendMessage(new IntPtr(handle), MOUSEEVENTF_RIGHTDOWN, IntPtr.Zero, new IntPtr(value));
    SendMessage(new IntPtr(handle), MOUSEEVENTF_RIGHTUP, IntPtr.Zero, new IntPtr(value));
}
Any idea what I'm doing wrong?
You're mixing up your types. You're using SendMessage, which takes a window message (which by convention are named WM_...), but you're passing it a MOUSEINPUT.dwFlags value that's meant for SendInput (those are named MOUSEEVENTF_...). You're basically passing gibberish.

What your code is actually doing is sending a window message whose numeric value is 8 (which, in window messages, means WM_KILLFOCUS), followed by a window message of 0x10 == 16 (WM_CLOSE). It's the latter that's likely causing you problems -- you're telling the window to close. I'm not sure why it would crash, but it would certainly exit.

If you're using SendMessage, you need to pass it window messages (WM_..., for example WM_RBUTTONDOWN and WM_RBUTTONUP).
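A minimal sketch of what the corrected call could look like, with the caveats from the answers below still applying. Note two additional assumptions beyond the question's code: WM_RBUTTONDOWN/WM_RBUTTONUP expect *client-relative* coordinates with x in the low word and y in the high word (the question's shift puts x in the high word, and `<<` binds looser than `+` anyway), and wParam conventionally carries the MK_RBUTTON flag:

```csharp
using System;
using System.Runtime.InteropServices;
using System.Windows.Automation;

static class RightClicker
{
    // Window message values from WinUser.h.
    const uint WM_RBUTTONDOWN = 0x0204;
    const uint WM_RBUTTONUP   = 0x0205;
    const int  MK_RBUTTON     = 0x0002; // "right button is down" flag for wParam

    [DllImport("user32.dll", CharSet = CharSet.Auto)]
    static extern IntPtr SendMessage(IntPtr hWnd, uint Msg, IntPtr wParam, IntPtr lParam);

    [DllImport("user32.dll")]
    static extern bool ScreenToClient(IntPtr hWnd, ref POINT lpPoint);

    [StructLayout(LayoutKind.Sequential)]
    struct POINT { public int X; public int Y; }

    // MAKELPARAM: x in the low word, y in the high word.
    static IntPtr MakeLParam(int x, int y) => new IntPtr((y << 16) | (x & 0xFFFF));

    public static void RightClick(AutomationElement element, IntPtr handle)
    {
        // GetClickablePoint returns screen coordinates; mouse messages
        // carry coordinates relative to the target window's client area.
        var screenPoint = element.GetClickablePoint();
        var p = new POINT { X = (int)screenPoint.X, Y = (int)screenPoint.Y };
        ScreenToClient(handle, ref p);

        var lParam = MakeLParam(p.X, p.Y);
        SendMessage(handle, WM_RBUTTONDOWN, new IntPtr(MK_RBUTTON), lParam);
        SendMessage(handle, WM_RBUTTONUP, IntPtr.Zero, lParam);
    }
}
```

This sends to the top-level window handle (as the question does); a real app usually routes the message to the child window under the point, so you may need to walk down with ChildWindowFromPoint first.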
This reply is long enough that I'll put it in an answer slot. I think my answer is basically that the question you're asking is based on an incorrect assumption.
The key issue is that while it's fine to run API-type tests in the background on the same desktop that you're working on, it rarely works out well to do the same for UI-based tests.
Your best bet for long-running UI tests is two machines and a keyboard/monitor switch, or using terminal services to run the test app in a session of its own, so it can have its own view of the world (focus, mouse, keyboard state) that won't interfere with the desktop you're working on.
The fundamental issue is that some UI resources - notably the mouse pointer, and keyboard focus - are shared among all apps on the desktop. And many (most? all?) apps assume that when they're being interacted with, they can do as they please with these.
You can sometimes get away with 'lying' to an app and sending it messages that would usually be the end result of input (such as sending WM_LBUTTONDOWN instead of calling SendInput), but if the app ends up looking at the global mouse state, you'll end up with an inconsistency.
For example, an app might respond to WM_LBUTTONDOWN by using the coords passed as parameters. Or it might ignore them and call GetCursorPos instead - and that could lead to really strange behavior if the mouse is really over your email program instead of the app.
Or you might send a WM_LBUTTONDOWN, and the app responds to it by calling some helper function. The helper function uses GetKeyState(VK_LBUTTON) to check if the mouse button is actually down - notices that it isn't, so bails early.
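To make the GetKeyState pitfall concrete, here's a hypothetical handler of the kind described above (the names OnMessage and BeginDragOrClick are invented for illustration). A sent WM_LBUTTONDOWN reaches it, but the physical button isn't down, so the early-out fires:

```csharp
using System;
using System.Runtime.InteropServices;

class HypotheticalControl
{
    [DllImport("user32.dll")]
    static extern short GetKeyState(int nVirtKey);

    const int  VK_LBUTTON     = 0x01;
    const uint WM_LBUTTONDOWN = 0x0201;

    // Hypothetical message handler inside the target app.
    void OnMessage(uint msg, IntPtr wParam, IntPtr lParam)
    {
        if (msg == WM_LBUTTONDOWN)
        {
            // High bit set => the button is physically down right now.
            bool buttonReallyDown = (GetKeyState(VK_LBUTTON) & 0x8000) != 0;
            if (!buttonReallyDown)
                return; // bails early: the sent message disagrees with global state

            BeginDragOrClick(lParam); // invented helper, for illustration only
        }
    }

    void BeginDragOrClick(IntPtr lParam) { /* ... */ }
}
```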
(Also, sending the end-result message bypasses other stuff that the app might be relying on; if you send keys directly to a window, you'll bypass much of the accelerator and dialog handling code that's usually in the message loop.)
If the app uses SetCapture() - which is very common for things-that-can-be-clicked, such as buttons and the like - it will fail if the app doesn't have focus. You might get lucky and the app will ignore the failure - or you might not. Menu-type controls often assume that the app has focus, and will dismiss themselves if they notice that focus is actually elsewhere...
If you own the app that's being tested, you might be able to take this into account and write it such that it can be 'tested' in the background: but be aware that it's no longer running in a manner that's consistent with actual user interaction - so arguably it's not a valid user-equivalent test case! - that's up to you to figure out given your test requirements.
Long story short: you might be able to get something to work here in this specific case, but be aware that there's a whole bunch of issues lurking here, and note that this is definitely not considered to be a UI testing best practice!
What messages exactly do you send - can you give their Windows definitions? According to MSDN:

#define WM_RBUTTONDOWN 0x0204
#define WM_RBUTTONUP   0x0205
You didn't post the details of the crash (exception type? location?), but my first suspicion is a reentrancy issue. I would try using PostMessage instead of SendMessage. SendMessage is synchronous and waits for the message to be processed before returning, so things get executed during its call. PostMessage just puts the message in the queue and then returns, and the processing takes place afterwards.
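A sketch of the suggested swap, assuming the `handle` and packed-coordinate `lParam` from the question's code; the only change is that the calls queue the messages and return immediately instead of waiting for the target's window procedure to run:

```csharp
[DllImport("user32.dll", CharSet = CharSet.Auto)]
[return: MarshalAs(UnmanagedType.Bool)]
static extern bool PostMessage(IntPtr hWnd, uint Msg, IntPtr wParam, IntPtr lParam);

const uint WM_RBUTTONDOWN = 0x0204;
const uint WM_RBUTTONUP   = 0x0205;
const int  MK_RBUTTON     = 0x0002;

// Queue both messages; the target processes them on its own message loop,
// so nothing re-enters your code while they're being handled.
PostMessage(handle, WM_RBUTTONDOWN, new IntPtr(MK_RBUTTON), lParam);
PostMessage(handle, WM_RBUTTONUP, IntPtr.Zero, lParam);
```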