
Simulate mouse cursor movement in C# between two coordinates

I am trying to move the mouse programmatically between two coordinates, but I want to maintain the speed reliably on both fast and slow machines. I saw this link here, but it doesn't guarantee an optimal, smooth, and visible cursor speed when simulating the move between two coordinates. I wonder if anyone knows a trick to determine optimal values for parameters like the delay and the number of steps on various machines. My first idea was to run a for-loop for a specific number of iterations to gauge the machine's performance, then grade the parameters based on how long the loop took... any ideas? Or am I totally wrong on this? Thanks


You should make the motion a function of time. Starting with the answer at C# moving the mouse around realistically, and using the Stopwatch class to measure the elapsed time:

public void LinearSmoothMove(Point newPosition, TimeSpan duration) 
{
    Point start = GetCursorPosition();

    // Find the vector between start and newPosition
    double deltaX = newPosition.X - start.X;
    double deltaY = newPosition.Y - start.Y;

    // start a timer
    Stopwatch stopwatch = new Stopwatch();
    stopwatch.Start();

    double timeFraction = 0.0;

    do
    {
        timeFraction = (double)stopwatch.Elapsed.Ticks / duration.Ticks;
        if (timeFraction > 1.0)
            timeFraction = 1.0;

        // Cast to float: PointF's constructor takes floats, while the
        // interpolation is done in double precision
        PointF curPoint = new PointF((float)(start.X + timeFraction * deltaX),
                                     (float)(start.Y + timeFraction * deltaY));
        SetCursorPosition(Point.Round(curPoint));
        Thread.Sleep(20);
    } while (timeFraction < 1.0);
}
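GetCursorPosition and SetCursorPosition aren't defined above. A minimal sketch of what they might wrap, assuming Windows Forms' Cursor.Position for reading and the Win32 SetCursorPos for writing (the same approach as the full listing further down); CursorHelpers is just a placeholder name:

using System.Drawing;
using System.Runtime.InteropServices;
using System.Windows.Forms;

static class CursorHelpers
{
    [DllImport("user32.dll")]
    static extern bool SetCursorPos(int X, int Y);

    // Read the current cursor position via Windows Forms.
    public static Point GetCursorPosition()
    {
        return Cursor.Position;
    }

    // Move the cursor with the Win32 API.
    public static void SetCursorPosition(Point p)
    {
        SetCursorPos(p.X, p.Y);
    }
}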


I would recommend some physics. Speed is distance divided by time. If you want a constant mouse speed on every machine, you have to measure time accurately.

Let's make an example:

You want to move the mouse from point (0, 0) to (400, 600), and the endpoint should always be reached after 3 seconds.

Therefore you have to save the start time and build a while loop which ends at starttime + 3s. In the loop you calculate the X and Y coordinates from the elapsed and the total time:

X = 400 / 3s * ElapsedTime
Y = 600 / 3s * ElapsedTime

This will be machine independent. For a good result you should use a high-accuracy timer like Stopwatch.
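A minimal sketch of that calculation, assuming the cursor starts at (0, 0) as in the example (the 400/600 target and 3-second duration are the numbers above):

using System;
using System.Diagnostics;
using System.Threading;
using System.Windows.Forms;

class ConstantSpeedMove
{
    static void Main()
    {
        const double targetX = 400, targetY = 600, durationSeconds = 3.0;

        Stopwatch stopwatch = Stopwatch.StartNew();
        double elapsed;
        do
        {
            // Clamp so the cursor stops exactly at the endpoint after 3 s.
            elapsed = Math.Min(stopwatch.Elapsed.TotalSeconds, durationSeconds);

            // Position is a pure function of elapsed time, so the average
            // speed (distance / time) is the same on fast and slow machines.
            int x = (int)(targetX / durationSeconds * elapsed);
            int y = (int)(targetY / durationSeconds * elapsed);
            Cursor.Position = new System.Drawing.Point(x, y);

            Thread.Sleep(10); // yield a little CPU between updates
        } while (elapsed < durationSeconds);
    }
}

For example, after 1.5 s the cursor should be at (200, 300), halfway along the path, regardless of the machine.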


I tried this one, but it's still not optimal; it still varies with the machine's processing power (one guess as to why is after the code below). @Justin, try different values for the duration and the sleep time. Let me know if you come up with a better solution once you've tested it. Thanks!

using System;
using System.Drawing;
using System.Runtime.InteropServices;
using System.Windows.Forms;
using System.Diagnostics;
using System.Threading;

namespace ConsoleApplication11
{
    class Program
    {
        [DllImport("user32.dll")]
        static extern bool SetCursorPos(int X, int Y);

        public static void LinearSmoothMove(Point newPosition, TimeSpan duration)
        {
            Point start = Cursor.Position;
            int sleep = 10;

            // Find the vector between start and newPosition
            double deltaX = newPosition.X - start.X;
            double deltaY = newPosition.Y - start.Y;

            // Start a timer
            Stopwatch stopwatch = new Stopwatch();
            stopwatch.Start();

            double timeFraction = 0.0;
            do
            {
                timeFraction = (double)stopwatch.Elapsed.Ticks / duration.Ticks;
                if (timeFraction > 1.0)
                    timeFraction = 1.0;

                PointF curPoint = new PointF((float)(start.X + timeFraction * deltaX),
                                             (float)(start.Y + timeFraction * deltaY));
                SetCursorPos(Point.Round(curPoint).X, Point.Round(curPoint).Y);
                Thread.Sleep(sleep);
            } while (timeFraction < 1.0);
        }

        static void Main(string[] args)
        {
            TimeSpan delayt = new TimeSpan(0, 0, 3);
            LinearSmoothMove(new Point(20, 40), delayt);
            Console.Read();
        }
    }
}
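One factor that might explain the remaining variance (an assumption, not something I've verified on every machine): Thread.Sleep on Windows is only as precise as the system timer, which often defaults to roughly 15.6 ms, so Thread.Sleep(10) can overshoot by different amounts on different machines. That only affects how smooth the motion looks, not the total duration, since the position is a function of elapsed time. A sketch of requesting a finer timer resolution with the winmm.dll timeBeginPeriod/timeEndPeriod APIs (TimerResolution and RunWithHighResolution are just placeholder names):

using System;
using System.Runtime.InteropServices;

static class TimerResolution
{
    [DllImport("winmm.dll")]
    static extern uint timeBeginPeriod(uint uMilliseconds);

    [DllImport("winmm.dll")]
    static extern uint timeEndPeriod(uint uMilliseconds);

    public static void RunWithHighResolution(Action action)
    {
        timeBeginPeriod(1);   // request ~1 ms system timer resolution
        try
        {
            action();
        }
        finally
        {
            timeEndPeriod(1); // restore the default resolution
        }
    }
}

Usage, wrapping the move above:

TimerResolution.RunWithHighResolution(() => LinearSmoothMove(new Point(20, 40), delayt));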