AWSRekognitionFaceDetail
Objective-C
@interface AWSRekognitionFaceDetail
Swift
class AWSRekognitionFaceDetail
Structure containing attributes of the face that the algorithm detected.
A FaceDetail object contains either the default facial attributes or all facial attributes. The default attributes are BoundingBox, Confidence, Landmarks, Pose, and Quality.
GetFaceDetection is the only Amazon Rekognition Video stored video operation that can return a FaceDetail object with all attributes. To specify which attributes to return, use the FaceAttributes input parameter for StartFaceDetection. The following Amazon Rekognition Video operations return only the default attributes. The corresponding Start operations don’t have a FaceAttributes input parameter:
GetCelebrityRecognition
GetPersonTracking
GetFaceSearch
The Amazon Rekognition Image DetectFaces and IndexFaces operations can return all facial attributes. To specify which attributes to return, use the Attributes input parameter for DetectFaces. For IndexFaces, use the DetectAttributes input parameter.
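For example, a minimal sketch of requesting all facial attributes with DetectFaces through this SDK might look like the following. The function name is illustrative, and it assumes the service client has already been configured with AWS credentials (for example via AWSServiceManager.default().defaultServiceConfiguration).
Swift
import AWSRekognition

// Sketch only: assumes AWS credentials and a default service configuration are already set up.
func detectAllFaceAttributes(in imageData: Data) {
    guard let request = AWSRekognitionDetectFacesRequest(),
          let image = AWSRekognitionImage() else { return }

    image.bytes = imageData
    request.image = image
    request.attributes = ["ALL"]   // ["DEFAULT"], or omitting the parameter, returns only the default attributes

    AWSRekognition.default().detectFaces(request) { response, error in
        if let error = error {
            print("DetectFaces failed: \(error)")
            return
        }
        // Each element is an AWSRekognitionFaceDetail with all attributes populated.
        for faceDetail in response?.faceDetails ?? [] {
            print("Face detected with confidence \(faceDetail.confidence?.doubleValue ?? 0)")
        }
    }
}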
ageRange
The estimated age range, in years, for the face. Low represents the lowest estimated age and High represents the highest estimated age.
Declaration
Objective-C
@property (nonatomic, strong) AWSRekognitionAgeRange *_Nullable ageRange;
Swift
var ageRange: AWSRekognitionAgeRange? { get set }
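As an illustration (the helper below is not part of the SDK), the range can be read back from its low and high bounds:
Swift
import AWSRekognition

// Illustrative helper: formats the estimated age range for display.
func ageRangeDescription(for faceDetail: AWSRekognitionFaceDetail) -> String {
    guard let range = faceDetail.ageRange, let low = range.low, let high = range.high else {
        return "Age range unavailable"
    }
    return "Estimated age: \(low) to \(high) years"
}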
beard
Indicates whether or not the face has a beard, and the confidence level in the determination.
Declaration
Objective-C
@property (nonatomic, strong) AWSRekognitionBeard *_Nullable beard;
Swift
var beard: AWSRekognitionBeard? { get set }
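Like the other boolean attributes on this type (mustache, smile, sunglasses, eyeglasses, eyesOpen, mouthOpen), the result pairs a value with a confidence score. A sketch of reading it defensively (the helper name and threshold are illustrative):
Swift
import AWSRekognition

// Illustrative: report a beard only when the flag is true and the confidence
// clears a caller-chosen threshold.
func hasBeard(_ faceDetail: AWSRekognitionFaceDetail, minimumConfidence: Double = 90) -> Bool {
    guard let beard = faceDetail.beard,
          let value = beard.value,
          let confidence = beard.confidence else { return false }
    return value.boolValue && confidence.doubleValue >= minimumConfidence
}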
boundingBox
Bounding box of the face. Default attribute.
Declaration
Objective-C
@property (nonatomic, strong) AWSRekognitionBoundingBox *_Nullable boundingBox;
Swift
var boundingBox: AWSRekognitionBoundingBox? { get set }
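The box coordinates are returned as ratios of the overall image width and height, so converting them to pixel coordinates might look like this sketch (the helper is illustrative, not part of the SDK):
Swift
import AWSRekognition
import CoreGraphics

// Illustrative: converts the normalized bounding box to a pixel-space CGRect
// for a given image size.
func pixelRect(for box: AWSRekognitionBoundingBox, imageSize: CGSize) -> CGRect? {
    guard let left = box.left, let top = box.top,
          let width = box.width, let height = box.height else { return nil }
    return CGRect(x: left.doubleValue * Double(imageSize.width),
                  y: top.doubleValue * Double(imageSize.height),
                  width: width.doubleValue * Double(imageSize.width),
                  height: height.doubleValue * Double(imageSize.height))
}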
confidence
Confidence level that the bounding box contains a face (and not a different object such as a tree). Default attribute.
Declaration
Objective-C
@property (nonatomic, strong) NSNumber *_Nullable confidence;
Swift
var confidence: NSNumber? { get set }
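Results are often filtered on this score before further processing; a small sketch (the threshold is illustrative):
Swift
import AWSRekognition

// Illustrative: keep only detections Rekognition is highly confident are faces.
func highConfidenceFaces(_ faceDetails: [AWSRekognitionFaceDetail],
                         threshold: Double = 95) -> [AWSRekognitionFaceDetail] {
    return faceDetails.filter { ($0.confidence?.doubleValue ?? 0) >= threshold }
}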
emotions
The emotions that appear to be expressed on the face, and the confidence level in the determination. The API is only making a determination of the physical appearance of a person’s face. It is not a determination of the person’s internal emotional state and should not be used in such a way. For example, a person pretending to have a sad face might not be sad emotionally.
Declaration
Objective-C
@property (nonatomic, strong) NSArray<AWSRekognitionEmotion *> *_Nullable emotions;
Swift
var emotions: [AWSRekognitionEmotion]? { get set }
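A common way to summarize the array is to take the entry with the highest confidence; a sketch (the helper is not part of the SDK):
Swift
import AWSRekognition

// Illustrative: picks the apparent emotion with the highest confidence score.
func dominantEmotion(for faceDetail: AWSRekognitionFaceDetail) -> AWSRekognitionEmotion? {
    return faceDetail.emotions?.max { lhs, rhs in
        (lhs.confidence?.doubleValue ?? 0) < (rhs.confidence?.doubleValue ?? 0)
    }
}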
eyeDirection
Indicates the direction the eyes are gazing in, as defined by pitch and yaw.
Declaration
Objective-C
@property (nonatomic, strong) AWSRekognitionEyeDirection *_Nullable eyeDirection;
Swift
var eyeDirection: AWSRekognitionEyeDirection? { get set }
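As a sketch (the tolerance, helper name, and the assumption that the angles are reported in degrees as pitch and yaw properties are illustrative), the values can support a rough “looking at the camera” check:
Swift
import AWSRekognition

// Illustrative: rough gaze check using the reported pitch and yaw angles.
func isGazingForward(_ faceDetail: AWSRekognitionFaceDetail, toleranceDegrees: Double = 15) -> Bool {
    guard let direction = faceDetail.eyeDirection,
          let pitch = direction.pitch?.doubleValue,
          let yaw = direction.yaw?.doubleValue else { return false }
    return abs(pitch) <= toleranceDegrees && abs(yaw) <= toleranceDegrees
}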
eyeglasses
Indicates whether or not the face is wearing eye glasses, and the confidence level in the determination.
Declaration
Objective-C
@property (nonatomic, strong) AWSRekognitionEyeglasses *_Nullable eyeglasses;
Swift
var eyeglasses: AWSRekognitionEyeglasses? { get set }
eyesOpen
Indicates whether or not the eyes on the face are open, and the confidence level in the determination.
Declaration
Objective-C
@property (nonatomic, strong) AWSRekognitionEyeOpen *_Nullable eyesOpen;
Swift
var eyesOpen: AWSRekognitionEyeOpen? { get set }
faceOccluded
FaceOccluded should return “true” with a high confidence score if a detected face’s eyes, nose, and mouth are partially captured or if they are covered by masks, dark sunglasses, cell phones, hands, or other objects. FaceOccluded should return “false” with a high confidence score if common occurrences that do not impact face verification are detected, such as eye glasses, lightly tinted sunglasses, strands of hair, and others.
Declaration
Objective-C
@property (nonatomic, strong) AWSRekognitionFaceOccluded *_Nullable faceOccluded;
Swift
var faceOccluded: AWSRekognitionFaceOccluded? { get set }
gender
The predicted gender of a detected face.
Declaration
Objective-C
@property (nonatomic, strong) AWSRekognitionGender *_Nullable gender;
Swift
var gender: AWSRekognitionGender? { get set }
landmarks
Indicates the location of landmarks on the face. Default attribute.
Declaration
Objective-C
@property (nonatomic, strong) NSArray<AWSRekognitionLandmark *> *_Nullable landmarks;
Swift
var landmarks: [AWSRekognitionLandmark]? { get set }
mouthOpen
Indicates whether or not the mouth on the face is open, and the confidence level in the determination.
Declaration
Objective-C
@property (nonatomic, strong) AWSRekognitionMouthOpen *_Nullable mouthOpen;
Swift
var mouthOpen: AWSRekognitionMouthOpen? { get set }
mustache
Indicates whether or not the face has a mustache, and the confidence level in the determination.
Declaration
Objective-C
@property (nonatomic, strong) AWSRekognitionMustache *_Nullable mustache;
Swift
var mustache: AWSRekognitionMustache? { get set }
pose
Indicates the pose of the face as determined by its pitch, roll, and yaw. Default attribute.
Declaration
Objective-C
@property (nonatomic, strong) AWSRekognitionPose *_Nullable pose;
Swift
var pose: AWSRekognitionPose? { get set }
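For instance (the threshold and helper name are illustrative), the angles can be used to screen for roughly frontal faces:
Swift
import AWSRekognition

// Illustrative: accepts a face only when pitch, roll, and yaw (in degrees) are all small.
func isRoughlyFrontal(_ pose: AWSRekognitionPose, toleranceDegrees: Double = 20) -> Bool {
    let angles = [pose.pitch, pose.roll, pose.yaw].compactMap { $0?.doubleValue }
    return angles.count == 3 && angles.allSatisfy { abs($0) <= toleranceDegrees }
}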
quality
Identifies image brightness and sharpness. Default attribute.
Declaration
Objective-C
@property (nonatomic, strong) AWSRekognitionImageQuality *_Nullable quality;
Swift
var quality: AWSRekognitionImageQuality? { get set }
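A sketch of a simple quality gate (the thresholds and helper name are illustrative; higher values indicate a brighter or sharper face image):
Swift
import AWSRekognition

// Illustrative: accept the face only above caller-chosen brightness and sharpness scores.
func meetsQualityBar(_ quality: AWSRekognitionImageQuality,
                     minimumBrightness: Double = 40,
                     minimumSharpness: Double = 40) -> Bool {
    guard let brightness = quality.brightness?.doubleValue,
          let sharpness = quality.sharpness?.doubleValue else { return false }
    return brightness >= minimumBrightness && sharpness >= minimumSharpness
}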
smile
Indicates whether or not the face is smiling, and the confidence level in the determination.
Declaration
Objective-C
@property (nonatomic, strong) AWSRekognitionSmile *_Nullable smile;
Swift
var smile: AWSRekognitionSmile? { get set }
sunglasses
Indicates whether or not the face is wearing sunglasses, and the confidence level in the determination.
Declaration
Objective-C
@property (nonatomic, strong) AWSRekognitionSunglasses *_Nullable sunglasses;
Swift
var sunglasses: AWSRekognitionSunglasses? { get set }