SafeSearchAnnotation class

Set of features pertaining to the image, computed by computer vision methods over safe-search verticals (for example, adult, spoof, medical, violence).

Constructors

SafeSearchAnnotation({SafeSearchDetectionType adult = SafeSearchDetectionType.UNKNOWN, SafeSearchDetectionType medical = SafeSearchDetectionType.UNKNOWN, SafeSearchDetectionType racy = SafeSearchDetectionType.UNKNOWN, SafeSearchDetectionType spoof = SafeSearchDetectionType.UNKNOWN, SafeSearchDetectionType violence = SafeSearchDetectionType.UNKNOWN})
SafeSearchAnnotation.fromJson(Map<String, dynamic> json)
factory
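
A hedged sketch of both constructors in use. The import path and the non-UNKNOWN enum members (e.g. VERY_UNLIKELY, POSSIBLE) are assumptions not confirmed by this page — only SafeSearchDetectionType.UNKNOWN appears in the signature above.

```dart
// Hypothetical import; use whichever package defines SafeSearchAnnotation.
import 'package:google_vision/google_vision.dart';

void main() {
  // Default constructor: every vertical defaults to UNKNOWN.
  final blank = SafeSearchAnnotation();
  assert(blank.adult == SafeSearchDetectionType.UNKNOWN);

  // Factory constructor, fed a map shaped like the Vision API's
  // safeSearchAnnotation response (field names and values assumed).
  final annotation = SafeSearchAnnotation.fromJson({
    'adult': 'VERY_UNLIKELY',
    'medical': 'UNLIKELY',
    'racy': 'POSSIBLE',
    'spoof': 'VERY_UNLIKELY',
    'violence': 'UNLIKELY',
  });
  print(annotation.racy);
}
```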

Properties

adult ↔ SafeSearchDetectionType
Represents the adult content likelihood for the image. Adult content may contain elements such as nudity, pornographic images or cartoons, or sexual activities.
read / write
hashCode → int
The hash code for this object.
read-only, inherited
medical ↔ SafeSearchDetectionType
Likelihood that this is a medical image.
read / write
racy ↔ SafeSearchDetectionType
Likelihood that the request image contains racy content. Racy content may include (but is not limited to) skimpy or sheer clothing, strategically covered nudity, lewd or provocative poses, or close-ups of sensitive body areas.
read / write
runtimeType → Type
A representation of the runtime type of the object.
read-only, inherited
spoof ↔ SafeSearchDetectionType
Spoof likelihood. The likelihood that a modification was made to the image's canonical version to make it appear funny or offensive.
read / write
violence ↔ SafeSearchDetectionType
Likelihood that this image contains violent content.
read / write
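
A sketch of how these likelihood properties might be consulted after detection. To keep the example self-contained it uses a minimal stand-in enum; the real SafeSearchDetectionType's members beyond UNKNOWN are an assumption here.

```dart
// Stand-in for SafeSearchDetectionType: members other than UNKNOWN
// are assumed, mirroring the Vision API's likelihood scale.
enum SafeSearchDetectionType {
  UNKNOWN,
  VERY_UNLIKELY,
  UNLIKELY,
  POSSIBLE,
  LIKELY,
  VERY_LIKELY,
}

/// True when any safe-search vertical reports an elevated likelihood.
bool isFlagged(List<SafeSearchDetectionType> verticals) {
  const elevated = {
    SafeSearchDetectionType.LIKELY,
    SafeSearchDetectionType.VERY_LIKELY,
  };
  return verticals.any(elevated.contains);
}

void main() {
  // In real use these would be annotation.adult, annotation.racy, etc.
  assert(!isFlagged([SafeSearchDetectionType.UNKNOWN]));
  assert(isFlagged([
    SafeSearchDetectionType.UNLIKELY,
    SafeSearchDetectionType.VERY_LIKELY,
  ]));
}
```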

Methods

noSuchMethod(Invocation invocation) → dynamic
Invoked when a non-existent method or property is accessed.
inherited
toJson() → Map<String, dynamic>
Returns a JSON-encodable map representation of this annotation.
toString() → String
A string representation of this object.
override
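
Assuming toJson produces the same map shape that fromJson consumes (a common Dart serialization convention, but not stated on this page), a round trip might look like:

```dart
// Hypothetical import; use whichever package defines SafeSearchAnnotation.
import 'package:google_vision/google_vision.dart';

void main() {
  final original = SafeSearchAnnotation();
  final Map<String, dynamic> json = original.toJson();

  // Rebuild from the serialized form; each vertical should survive.
  final copy = SafeSearchAnnotation.fromJson(json);
  assert(copy.violence == original.violence);
  print(copy);
}
```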

Operators

operator ==(Object other) → bool
The equality operator.
inherited