public class DetectModerationLabelsResult
extends java.lang.Object
implements java.io.Serializable
Constructor and Description |
---|
DetectModerationLabelsResult() |
Modifier and Type | Method and Description
---|---
boolean | equals(java.lang.Object obj)
HumanLoopActivationOutput | getHumanLoopActivationOutput() Shows the results of the human-in-the-loop evaluation.
java.util.List<ModerationLabel> | getModerationLabels() Array of detected Moderation labels and the time, in milliseconds from the start of the video, at which they were detected.
java.lang.String | getModerationModelVersion() Version number of the moderation detection model that was used to detect unsafe content.
int | hashCode()
void | setHumanLoopActivationOutput(HumanLoopActivationOutput humanLoopActivationOutput) Shows the results of the human-in-the-loop evaluation.
void | setModerationLabels(java.util.Collection<ModerationLabel> moderationLabels) Array of detected Moderation labels and the time, in milliseconds from the start of the video, at which they were detected.
void | setModerationModelVersion(java.lang.String moderationModelVersion) Version number of the moderation detection model that was used to detect unsafe content.
java.lang.String | toString() Returns a string representation of this object; useful for testing and debugging.
DetectModerationLabelsResult | withHumanLoopActivationOutput(HumanLoopActivationOutput humanLoopActivationOutput) Shows the results of the human-in-the-loop evaluation.
DetectModerationLabelsResult | withModerationLabels(java.util.Collection<ModerationLabel> moderationLabels) Array of detected Moderation labels and the time, in milliseconds from the start of the video, at which they were detected.
DetectModerationLabelsResult | withModerationLabels(ModerationLabel... moderationLabels) Array of detected Moderation labels and the time, in milliseconds from the start of the video, at which they were detected.
DetectModerationLabelsResult | withModerationModelVersion(java.lang.String moderationModelVersion) Version number of the moderation detection model that was used to detect unsafe content.
public java.util.List<ModerationLabel> getModerationLabels()
Array of detected Moderation labels and the time, in milliseconds from the start of the video, at which they were detected.
Returns:
Array of detected Moderation labels and the time, in milliseconds from the start of the video, at which they were detected.

public void setModerationLabels(java.util.Collection<ModerationLabel> moderationLabels)
Array of detected Moderation labels and the time, in milliseconds from the start of the video, at which they were detected.
Parameters:
moderationLabels - Array of detected Moderation labels and the time, in milliseconds from the start of the video, at which they were detected.

public DetectModerationLabelsResult withModerationLabels(ModerationLabel... moderationLabels)
Array of detected Moderation labels and the time, in milliseconds from the start of the video, at which they were detected.
Parameters:
moderationLabels - Array of detected Moderation labels and the time, in milliseconds from the start of the video, at which they were detected.
Returns:
A reference to this object so that method calls can be chained together.

public DetectModerationLabelsResult withModerationLabels(java.util.Collection<ModerationLabel> moderationLabels)
Array of detected Moderation labels and the time, in milliseconds from the start of the video, at which they were detected.
Parameters:
moderationLabels - Array of detected Moderation labels and the time, in milliseconds from the start of the video, at which they were detected.
Returns:
A reference to this object so that method calls can be chained together.
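The with* methods above return this, so calls can be chained fluently. A minimal self-contained sketch of that pattern (MiniResult and its fields are hypothetical stand-ins for illustration, not the SDK classes):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical stand-in for a result class with fluent "with*" setters.
class MiniResult {
    private final List<String> labels = new ArrayList<>();
    private String modelVersion;

    // Each with* method mutates a field and returns this, enabling chaining.
    public MiniResult withLabels(String... labels) {
        this.labels.addAll(Arrays.asList(labels));
        return this;
    }

    public MiniResult withModelVersion(String version) {
        this.modelVersion = version;
        return this;
    }

    public List<String> getLabels() { return labels; }
    public String getModelVersion() { return modelVersion; }
}

public class ChainingDemo {
    public static void main(String[] args) {
        // Chained calls read left to right, like the SDK's with* methods.
        MiniResult result = new MiniResult()
                .withLabels("Suggestive", "Violence")
                .withModelVersion("3.0");
        System.out.println(result.getLabels() + " " + result.getModelVersion());
        // prints: [Suggestive, Violence] 3.0
    }
}
```

Note that the varargs overload appends to the list, mirroring how the SDK offers both varargs and Collection variants of withModerationLabels.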
public java.lang.String getModerationModelVersion()
Version number of the moderation detection model that was used to detect unsafe content.
Returns:
Version number of the moderation detection model that was used to detect unsafe content.

public void setModerationModelVersion(java.lang.String moderationModelVersion)
Version number of the moderation detection model that was used to detect unsafe content.
Parameters:
moderationModelVersion - Version number of the moderation detection model that was used to detect unsafe content.

public DetectModerationLabelsResult withModerationModelVersion(java.lang.String moderationModelVersion)
Version number of the moderation detection model that was used to detect unsafe content.
Parameters:
moderationModelVersion - Version number of the moderation detection model that was used to detect unsafe content.
Returns:
A reference to this object so that method calls can be chained together.
public HumanLoopActivationOutput getHumanLoopActivationOutput()
Shows the results of the human-in-the-loop evaluation.
Returns:
Shows the results of the human-in-the-loop evaluation.

public void setHumanLoopActivationOutput(HumanLoopActivationOutput humanLoopActivationOutput)
Shows the results of the human-in-the-loop evaluation.
Parameters:
humanLoopActivationOutput - Shows the results of the human-in-the-loop evaluation.

public DetectModerationLabelsResult withHumanLoopActivationOutput(HumanLoopActivationOutput humanLoopActivationOutput)
Shows the results of the human-in-the-loop evaluation.
Parameters:
humanLoopActivationOutput - Shows the results of the human-in-the-loop evaluation.
Returns:
A reference to this object so that method calls can be chained together.
public java.lang.String toString()
Returns a string representation of this object; useful for testing and debugging.
Overrides:
toString in class java.lang.Object
See Also:
Object.toString()

public int hashCode()
Overrides:
hashCode in class java.lang.Object

public boolean equals(java.lang.Object obj)
Overrides:
equals in class java.lang.Object
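The equals and hashCode overrides above are kept consistent so that two result objects with equal field values compare equal and hash equally. A self-contained sketch of that contract using a hypothetical value class (LabelVersion is illustrative, not the SDK's implementation):

```java
import java.util.Objects;

// Hypothetical value class demonstrating the equals/hashCode contract
// that result classes follow: equal field values imply equal hash codes.
class LabelVersion {
    private final String label;
    private final String modelVersion;

    LabelVersion(String label, String modelVersion) {
        this.label = label;
        this.modelVersion = modelVersion;
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj) return true;
        if (!(obj instanceof LabelVersion)) return false;
        LabelVersion other = (LabelVersion) obj;
        return Objects.equals(label, other.label)
                && Objects.equals(modelVersion, other.modelVersion);
    }

    @Override
    public int hashCode() {
        // Derived from the same fields equals() compares, as the contract requires.
        return Objects.hash(label, modelVersion);
    }
}

public class EqualsDemo {
    public static void main(String[] args) {
        LabelVersion a = new LabelVersion("Violence", "3.0");
        LabelVersion b = new LabelVersion("Violence", "3.0");
        // Equal field values: equals() is true and hash codes match.
        System.out.println(a.equals(b) && a.hashCode() == b.hashCode());
        // prints: true
    }
}
```

Because both methods derive from the same fields, instances of such a class behave correctly as keys in hash-based collections like HashMap and HashSet.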
Copyright © 2018 Amazon Web Services, Inc. All Rights Reserved.