AWSRekognitionDetectModerationLabelsRequest

Objective-C

@interface AWSRekognitionDetectModerationLabelsRequest

Swift

class AWSRekognitionDetectModerationLabelsRequest
  • Sets up the configuration for human evaluation, including the FlowDefinition the image will be sent to.

    Declaration

    Objective-C

    @property (nonatomic, strong) AWSRekognitionHumanLoopConfig *_Nullable humanLoopConfig;

    Swift

    var humanLoopConfig: AWSRekognitionHumanLoopConfig? { get set }
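A minimal sketch of wiring up human review, assuming a flow definition already created with Amazon Augmented AI (A2I); the loop name and flow definition ARN below are placeholders:

```swift
import AWSRekognition

// Route moderation results to a human review workflow.
// The ARN is a placeholder for a real A2I flow definition.
let humanLoopConfig = AWSRekognitionHumanLoopConfig()
humanLoopConfig?.humanLoopName = "moderation-review-loop"
humanLoopConfig?.flowDefinitionArn =
    "arn:aws:sagemaker:us-east-1:123456789012:flow-definition/my-flow"

let request = AWSRekognitionDetectModerationLabelsRequest()
request?.humanLoopConfig = humanLoopConfig
```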
  • The input image as base64-encoded bytes or an S3 object. If you use the AWS CLI to call Amazon Rekognition operations, passing base64-encoded image bytes is not supported.

    If you are using an AWS SDK to call Amazon Rekognition, you might not need to base64-encode image bytes passed using the Bytes field. For more information, see Images in the Amazon Rekognition developer guide.

    Declaration

    Objective-C

    @property (nonatomic, strong) AWSRekognitionImage *_Nullable image;

    Swift

    var image: AWSRekognitionImage? { get set }
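As a sketch, the image can be supplied either as raw bytes or as an S3 object reference; the bucket and key below are placeholders:

```swift
import AWSRekognition

let image = AWSRekognitionImage()

// Option 1: raw image bytes (e.g. JPEG data read from disk).
// image?.bytes = jpegData

// Option 2: a reference to an object in Amazon S3.
let s3Object = AWSRekognitionS3Object()
s3Object?.bucket = "my-bucket"
s3Object?.name = "photos/input.jpg"
image?.s3Object = s3Object

let request = AWSRekognitionDetectModerationLabelsRequest()
request?.image = image
```

When calling from an SDK rather than the CLI, the S3 reference avoids transferring image bytes in the request at all.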
  • Specifies the minimum confidence level for the labels to return. Amazon Rekognition doesn’t return any labels with a confidence level lower than this specified value.

    If you don’t specify MinConfidence, the operation returns labels with confidence values greater than or equal to 50 percent.

    Declaration

    Objective-C

    @property (nonatomic, strong) NSNumber *_Nullable minConfidence;

    Swift

    var minConfidence: NSNumber? { get set }
  • Identifier for the custom adapter. Expects the ProjectVersionArn as a value. Use the CreateProject or CreateProjectVersion APIs to create a custom adapter.

    Declaration

    Objective-C

    @property (nonatomic, strong) NSString *_Nullable projectVersion;

    Swift

    var projectVersion: String? { get set }
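Putting the properties together, a hedged end-to-end sketch of building the request and calling DetectModerationLabels through the SDK's service client; the bucket, key, and adapter ARN are placeholders:

```swift
import AWSRekognition

func detectModeration() {
    guard let request = AWSRekognitionDetectModerationLabelsRequest(),
          let image = AWSRekognitionImage(),
          let s3Object = AWSRekognitionS3Object() else { return }

    s3Object.bucket = "my-bucket"
    s3Object.name = "photos/input.jpg"
    image.s3Object = s3Object

    request.image = image
    request.minConfidence = 60   // return only labels at >= 60% confidence
    // Optionally target a custom moderation adapter (placeholder ARN):
    // request.projectVersion = "arn:aws:rekognition:us-east-1:123456789012:project/my-project/version/1/1234567890123"

    AWSRekognition.default().detectModerationLabels(request).continueWith { task in
        if let error = task.error {
            print("DetectModerationLabels failed: \(error)")
        } else if let labels = task.result?.moderationLabels {
            for label in labels {
                print("\(label.name ?? "unknown") \(label.confidence ?? 0)")
            }
        }
        return nil
    }
}
```

Omitting `minConfidence` falls back to the 50 percent default described above.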