I have made a UIPanGestureRecognizer that only detects mostly vertical pans; how can I make it detect only truly vertical pans?

I just implemented this:

    - (BOOL)gestureRecognizerShouldBegin:(UIPanGestureRecognizer *)panGestureRecognizer {
        CGPoint translation = [panGestureRecognizer translationInView:someView];
        return fabs(translation.y) > fabs(translation.x);
    }

(as described here)

However, it will still begin if the user swipes diagonally. How can I tighten the tolerance for what counts as vertical?

Basically, the pictures below describe what I'm after. The first diagram shows what is detected now (anything within that area); the second is what I want.


You can calculate the angle from vertical using atan2f with the x and y values. For example, to begin the gesture only if the pan is within four degrees of vertical, you could do:

    - (BOOL)gestureRecognizerShouldBegin:(UIPanGestureRecognizer *)gesture {
        CGPoint translation = [gesture translationInView:gesture.view];
        if (translation.x != 0 || translation.y != 0) {
            CGFloat angle = atan2f(fabs(translation.x), fabs(translation.y));
            return angle < (4.0 * M_PI / 180.0); // four degrees, but in radians
        }
        return NO;
    }

To detect purely vertical gestures only, I assume you would then test translation.x == 0.

You should also check the accepted answer in the post you linked, where the previous location is compared with the current one. You can build in a sensitivity: see my project, for example, where I use such a sensitivity to decide when a movement is valid (less than or equal to the sensitivity) or invalid (greater than the sensitivity). Look at MOVEMENT_SENSIBILITY in RPSliderViewController.m.

I have written a UIGestureRecognizer subclass for this. It only tracks vertical translations; maybe it helps you. You can use it like any other gesture recognizer: just set the threshold and track the translation in its target's action method.

VerticalPanGestureRecognizer.h

    #import <UIKit/UIKit.h>
    #import <UIKit/UIGestureRecognizerSubclass.h>

    @interface VerticalPanGestureRecognizer : UIGestureRecognizer

    @property (assign, nonatomic) float translation;
    @property (assign, nonatomic) float offsetThreshold;

    @end

VerticalPanGestureRecognizer.m

    #import "VerticalPanGestureRecognizer.h"

    @interface VerticalPanGestureRecognizer ()
    {
        CGPoint _startPoint;
    }
    @end

    @implementation VerticalPanGestureRecognizer

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
    {
        if ([touches count] > 1) {
            self.state = UIGestureRecognizerStateFailed;
        } else {
            _startPoint = [[touches anyObject] locationInView:self.view];
        }
    }

    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
    {
        if (self.state == UIGestureRecognizerStateFailed ||
            self.state == UIGestureRecognizerStateCancelled) {
            return;
        }

        CGPoint currentLocation = [[touches anyObject] locationInView:self.view];
        CGPoint translation;
        translation.x = currentLocation.x - _startPoint.x;
        translation.y = currentLocation.y - _startPoint.y;

        if (self.state == UIGestureRecognizerStatePossible) {
            // if the x-translation is above our threshold the gesture fails
            if (fabsf(translation.x) > self.offsetThreshold)
                self.state = UIGestureRecognizerStateFailed;
            // if the y-translation has reached the threshold the gesture is
            // recognized and we start sending action methods
            else if (fabsf(translation.y) > self.offsetThreshold)
                self.state = UIGestureRecognizerStateBegan;
            return;
        }

        // if we reached this point the gesture was successfully recognized,
        // so we now enter the changed state
        self.state = UIGestureRecognizerStateChanged;

        // we are just interested in the vertical translation
        self.translation = translation.y;
    }

    - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
    {
        // if at this point the state is still 'possible' the threshold
        // wasn't reached at all, so we fail
        if (self.state == UIGestureRecognizerStatePossible) {
            self.state = UIGestureRecognizerStateFailed;
        } else {
            CGPoint currentLocation = [[touches anyObject] locationInView:self.view];
            // same sign convention as in touchesMoved: current minus start
            self.translation = currentLocation.y - _startPoint.y;
            self.state = UIGestureRecognizerStateEnded;
        }
    }

    - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
    {
        self.state = UIGestureRecognizerStateCancelled;
    }

    - (void)reset
    {
        [super reset];
        _startPoint = CGPointZero;
    }

    @end