Is it possible to detect every individual pixel being touched? More specifically, when the user touches the screen, is it possible to track all the x-y coordinates of the cluster of points being touched? How can I tell the difference between the user drawing with a thumb and drawing with the tip of a finger? I would like to vary the brush depending on how the user touches the screen, and I would also like to track all of the pixels being touched.

I am currently using the code below, taken from the GLPaint sample on the Apple developer site:

http://developer.apple.com/library/ios/#samplecode/GLPaint/Introduction/Intro.html

The sample code draws with a predefined brush size and tracks the x-y coordinates along the way. How can I change the brush based on how the user touches the screen, and track all of the pixels being touched over time?

// Draws a line onscreen based on where the user touches

- (void)renderLineFromPoint:(CGPoint)start toPoint:(CGPoint)end
{
    NSLog(@"x:%f y:%f", start.x, start.y);

    static GLfloat*   vertexBuffer = NULL;
    static NSUInteger vertexMax = 64;
    NSUInteger        vertexCount = 0, count, i;

    [EAGLContext setCurrentContext:context];
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);

    // Convert locations from points to pixels
    CGFloat scale = self.contentScaleFactor;
    start.x *= scale;
    start.y *= scale;
    end.x *= scale;
    end.y *= scale;

    // Allocate vertex array buffer
    if (vertexBuffer == NULL)
        vertexBuffer = malloc(vertexMax * 2 * sizeof(GLfloat));

    // Add points to the buffer so there are drawing points every X pixels
    count = MAX(ceilf(sqrtf((end.x - start.x) * (end.x - start.x) + (end.y - start.y) * (end.y - start.y)) / kBrushPixelStep), 1);
    for (i = 0; i < count; ++i) {
        if (vertexCount == vertexMax) {
            vertexMax = 2 * vertexMax;
            vertexBuffer = realloc(vertexBuffer, vertexMax * 2 * sizeof(GLfloat));
        }
        vertexBuffer[2 * vertexCount + 0] = start.x + (end.x - start.x) * ((GLfloat)i / (GLfloat)count);
        vertexBuffer[2 * vertexCount + 1] = start.y + (end.y - start.y) * ((GLfloat)i / (GLfloat)count);
        vertexCount += 1;
    }

    // Render the vertex array
    glVertexPointer(2, GL_FLOAT, 0, vertexBuffer);
    glDrawArrays(GL_POINTS, 0, vertexCount);

    // Display the buffer
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER_OES];
}


// Handles the start of a touch
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect   bounds = [self bounds];
    UITouch* touch = [[event touchesForView:self] anyObject];
    firstTouch = YES;
    // Convert touch point from UIView referential to OpenGL one (upside-down flip)
    location = [touch locationInView:self];
    location.y = bounds.size.height - location.y;
}

// Handles the continuation of a touch.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect   bounds = [self bounds];
    UITouch* touch = [[event touchesForView:self] anyObject];

    // Convert touch point from UIView referential to OpenGL one (upside-down flip)
    if (firstTouch) {
        firstTouch = NO;
        previousLocation = [touch previousLocationInView:self];
        previousLocation.y = bounds.size.height - previousLocation.y;
    } else {
        location = [touch locationInView:self];
        location.y = bounds.size.height - location.y;
        previousLocation = [touch previousLocationInView:self];
        previousLocation.y = bounds.size.height - previousLocation.y;
    }

    // Render the stroke
    [self renderLineFromPoint:previousLocation toPoint:location];
}

// Handles the end of a touch event when the touch is a tap.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect   bounds = [self bounds];
    UITouch* touch = [[event touchesForView:self] anyObject];
    if (firstTouch) {
        firstTouch = NO;
        previousLocation = [touch previousLocationInView:self];
        previousLocation.y = bounds.size.height - previousLocation.y;
        [self renderLineFromPoint:previousLocation toPoint:location];
    }
}

// Handles the end of a touch event.
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // If appropriate, add code necessary to save the state of the application.
    // This application is not saving state.
}
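One note on "tracking all the pixels being touched": the handlers above pull a single touch per event via anyObject, so only one finger is ever drawn. A minimal sketch of handling every touch in the event instead, reusing renderLineFromPoint:toPoint: from the sample (and assuming multipleTouchEnabled is set to YES on the view):

    // Sketch: render a segment for every touch that moved, instead of just one.
    // Reuses renderLineFromPoint:toPoint: from the sample above; the view must
    // have multipleTouchEnabled set to YES for more than one UITouch to arrive.
    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
    {
        CGRect bounds = [self bounds];
        for (UITouch *touch in touches) {
            // Each UITouch remembers its own previous position, so no extra
            // per-touch state is needed for simple line segments.
            CGPoint from = [touch previousLocationInView:self];
            CGPoint to   = [touch locationInView:self];
            // Flip from UIKit coordinates to the OpenGL coordinate system.
            from.y = bounds.size.height - from.y;
            to.y   = bounds.size.height - to.y;
            [self renderLineFromPoint:from toPoint:to];
        }
    }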

Answers

AFAIK there is no API for accessing the touched area of a touch. Given the limitations of capacitive touchscreens, I'm not even sure that what you want is physically possible. I remember a recent demonstration at Cocoa Heads showing that some of this information is available on OS X (via private APIs) for the trackpad, but not on iOS.
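For what it's worth, later iOS releases (iOS 8 and up) did add an approximate touch size to UIKit via UITouch's majorRadius and majorRadiusTolerance properties. It is a rough radius in points rather than a pixel mask, but it can at least drive a thumb-versus-fingertip brush choice. A minimal sketch, where the threshold and both brush widths are arbitrary assumptions:

    // Sketch: picking a brush width from the reported touch radius (iOS 8+).
    // kThumbRadiusThreshold and both brush widths are arbitrary assumptions.
    static const CGFloat kThumbRadiusThreshold = 25.0; // points

    - (CGFloat)brushWidthForTouch:(UITouch *)touch
    {
        // majorRadius is the system's estimate of the touched area's radius, in points;
        // majorRadiusTolerance says how far off that estimate may be.
        if (touch.majorRadius - touch.majorRadiusTolerance > kThumbRadiusThreshold) {
            return 30.0; // probably a thumb or the flat of a finger: broad brush
        }
        return 10.0;     // probably a fingertip: narrow brush
    }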

I believe this is one of the reasons drawing tablets come with a dedicated stylus that has its own sensor technology built in.

A partial solution for a drawing application might be to simulate "inking" the way some desktop applications do: if the user's touch lingers at a particular point, draw as if ink were flowing out of the "pen" and gradually spreading through the "paper".
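A minimal sketch of that inking idea, assuming hypothetical inkOrigin and inkStartTime ivars and a drawBlotAtPoint:radius: drawing routine (none of these exist in the GLPaint sample):

    // Sketch of the "inking" idea: the longer a touch lingers near one spot,
    // the larger the simulated ink blot grows. All names here are hypothetical;
    // drawBlotAtPoint:radius: stands in for whatever drawing routine the app uses.
    static const CGFloat kMoveTolerance   = 4.0;  // points the touch may drift and still count as lingering
    static const CGFloat kMaxBlotRadius   = 40.0; // cap on how far the ink spreads
    static const CGFloat kSpreadPerSecond = 25.0;

    - (void)updateInkForTouch:(UITouch *)touch
    {
        CGPoint p = [touch locationInView:self];
        NSTimeInterval now = touch.timestamp;

        if (hypot(p.x - inkOrigin.x, p.y - inkOrigin.y) > kMoveTolerance) {
            // The finger has moved away: start a fresh blot at the new location.
            inkOrigin    = p;
            inkStartTime = now;
        }

        // The radius grows with dwell time, up to a maximum, as if ink were soaking into paper.
        CGFloat radius = MIN((now - inkStartTime) * kSpreadPerSecond, kMaxBlotRadius);
        [self drawBlotAtPoint:inkOrigin radius:radius];
    }

Since touchesMoved: only fires while the finger actually moves, a timer or CADisplayLink would have to drive something like updateInkForTouch: while the finger holds still.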

Thank you very much for the reply. Do you know whether this is possible on Android? – HappyAppDeveloper 2012-04-04 17:06:41

I don't know, sorry; I don't develop for Android. I'm sure someone else will have an answer, though. If you can't find anything by searching, I'd suggest posting a new question tagged Android. – 2012-04-04 17:10:20

The Broadcom hardware in the iPad scans the screen at 64 Hz. It does this by placing a 400 µs signal on each of the 39 transparent conductive layers that make up the touchscreen electrodes in turn (39 × 400 µs ≈ 15.6 ms per full sweep, i.e. roughly 64 Hz, or one scan every 0.015625 s). If your finger moves more than one pixel within those 0.015625 seconds, which is very likely, the hardware cannot detect it, because it is busy measuring other parts of the screen for further touch events.

This is the same whether you are on iOS or Android. Cheap Android tablets and large screens scan at lower rates, so their touch events are spaced even further apart.

Wacom tablets run their digitizers at around 100 Hz, so the sequence of points is finer, but they will still miss the pixels the stylus touches between two measurements.
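To put that 64 Hz figure in perspective, here is a rough back-of-the-envelope estimate; the 10 cm/s finger speed is an assumption, and 132 ppi is the pixel density of a non-Retina iPad:

    // Rough estimate of how many pixels a finger skips between two 64 Hz scans.
    #import <Foundation/Foundation.h>

    int main(void)
    {
        @autoreleasepool {
            const double scanInterval = 1.0 / 64.0;   // 0.015625 s between full sweeps
            const double fingerSpeed  = 10.0;         // cm/s, a moderate drawing speed (assumed)
            const double pixelsPerCm  = 132.0 / 2.54; // non-Retina iPad: 132 ppi, about 52 px/cm
            double skipped = fingerSpeed * scanInterval * pixelsPerCm;
            NSLog(@"~%.0f pixels pass between consecutive scans", skipped);
            // Everything in between has to be interpolated, which is exactly what
            // GLPaint's renderLineFromPoint:toPoint: does with kBrushPixelStep.
        }
        return 0;
    }

That works out to roughly eight pixels at a modest drawing speed, so interpolating between sampled points is unavoidable.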