Class: Google::Apis::DataprocV1::Batch
- Inherits: Object
  - Object
  - Google::Apis::DataprocV1::Batch
- Includes: Core::Hashable, Core::JsonObjectSupport
- Defined in:
  - lib/google/apis/dataproc_v1/classes.rb
  - lib/google/apis/dataproc_v1/representations.rb
Overview
A representation of a batch workload in the service.
Instance Attribute Summary collapse
-
#create_time ⇒ String
Output only.
-
#creator ⇒ String
Output only.
-
#environment_config ⇒ Google::Apis::DataprocV1::EnvironmentConfig
Environment configuration for a workload.
-
#labels ⇒ Hash<String,String>
Optional.
-
#name ⇒ String
Output only.
-
#operation ⇒ String
Output only.
-
#pyspark_batch ⇒ Google::Apis::DataprocV1::PySparkBatch
A configuration for running an Apache PySpark (https://spark.apache.org/docs/latest/api/python/getting_started/quickstart.html) batch workload.
-
#runtime_config ⇒ Google::Apis::DataprocV1::RuntimeConfig
Runtime configuration for a workload.
-
#runtime_info ⇒ Google::Apis::DataprocV1::RuntimeInfo
Runtime information about workload execution.
-
#spark_batch ⇒ Google::Apis::DataprocV1::SparkBatch
A configuration for running an Apache Spark (https://spark.apache.org/) batch workload.
-
#spark_r_batch ⇒ Google::Apis::DataprocV1::SparkRBatch
A configuration for running an Apache SparkR (https://spark.apache.org/docs/latest/sparkr.html) batch workload.
-
#spark_sql_batch ⇒ Google::Apis::DataprocV1::SparkSqlBatch
A configuration for running Apache Spark SQL (https://spark.apache.org/sql/) queries as a batch workload.
-
#state ⇒ String
Output only.
-
#state_history ⇒ Array<Google::Apis::DataprocV1::StateHistory>
Output only.
-
#state_message ⇒ String
Output only.
-
#state_time ⇒ String
Output only.
-
#uuid ⇒ String
Output only.
Instance Method Summary collapse
-
#initialize(**args) ⇒ Batch
constructor
A new instance of Batch.
-
#update!(**args) ⇒ Object
Update properties of this object.
Constructor Details
#initialize(**args) ⇒ Batch
Returns a new instance of Batch.
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1061
def initialize(**args)
  update!(**args)
end
Instance Attribute Details
#create_time ⇒ String
Output only. The time when the batch was created.
Corresponds to the JSON property createTime
# File 'lib/google/apis/dataproc_v1/classes.rb', line 969
def create_time
  @create_time
end
#creator ⇒ String
Output only. The email address of the user who created the batch.
Corresponds to the JSON property creator
# File 'lib/google/apis/dataproc_v1/classes.rb', line 974
def creator
  @creator
end
#environment_config ⇒ Google::Apis::DataprocV1::EnvironmentConfig
Environment configuration for a workload.
Corresponds to the JSON property environmentConfig
# File 'lib/google/apis/dataproc_v1/classes.rb', line 979
def environment_config
  @environment_config
end
#labels ⇒ Hash<String,String>
Optional. The labels to associate with this batch. Label keys must contain 1 to 63 characters, and must conform to RFC 1035 (https://www.ietf.org/rfc/rfc1035.txt). Label values may be empty, but, if present, must contain 1 to 63 characters, and must conform to RFC 1035 (https://www.ietf.org/rfc/rfc1035.txt). No more than 32 labels can be associated with a batch.
Corresponds to the JSON property labels
# File 'lib/google/apis/dataproc_v1/classes.rb', line 988
def labels
  @labels
end
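The label rules above can be checked client-side before submitting a batch. The sketch below assumes the RFC 1035 "label" grammar (a lowercase letter first, then letters, digits, or hyphens, ending in a letter or digit, at most 63 characters); the helper name is illustrative and not part of the gem, and the service performs its own authoritative validation.

```ruby
# Illustrative RFC 1035 label pattern: 1-63 chars, starts with a lowercase
# letter, ends with a letter or digit, hyphens allowed in between.
RFC1035_LABEL = /\A[a-z]([a-z0-9-]{0,61}[a-z0-9])?\z/

# Hypothetical pre-flight check mirroring the documented constraints:
# keys must match the pattern, values may be empty but must match when
# present, and a batch may carry at most 32 labels.
def valid_batch_labels?(labels)
  return false if labels.size > 32
  labels.all? do |key, value|
    key.match?(RFC1035_LABEL) &&
      (value.empty? || value.match?(RFC1035_LABEL))
  end
end
```

For example, `valid_batch_labels?({ "env" => "prod", "scratch" => "" })` passes, while a key beginning with a digit does not.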
#name ⇒ String
Output only. The resource name of the batch.
Corresponds to the JSON property name
# File 'lib/google/apis/dataproc_v1/classes.rb', line 993
def name
  @name
end
#operation ⇒ String
Output only. The resource name of the operation associated with this batch.
Corresponds to the JSON property operation
# File 'lib/google/apis/dataproc_v1/classes.rb', line 998
def operation
  @operation
end
#pyspark_batch ⇒ Google::Apis::DataprocV1::PySparkBatch
A configuration for running an Apache PySpark (https://spark.apache.org/docs/latest/api/python/getting_started/quickstart.html) batch workload.
Corresponds to the JSON property pysparkBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1004
def pyspark_batch
  @pyspark_batch
end
#runtime_config ⇒ Google::Apis::DataprocV1::RuntimeConfig
Runtime configuration for a workload.
Corresponds to the JSON property runtimeConfig
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1009
def runtime_config
  @runtime_config
end
#runtime_info ⇒ Google::Apis::DataprocV1::RuntimeInfo
Runtime information about workload execution.
Corresponds to the JSON property runtimeInfo
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1014
def runtime_info
  @runtime_info
end
#spark_batch ⇒ Google::Apis::DataprocV1::SparkBatch
A configuration for running an Apache Spark (https://spark.apache.org/) batch
workload.
Corresponds to the JSON property sparkBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1020
def spark_batch
  @spark_batch
end
#spark_r_batch ⇒ Google::Apis::DataprocV1::SparkRBatch
A configuration for running an Apache SparkR (https://spark.apache.org/docs/latest/sparkr.html) batch workload.
Corresponds to the JSON property sparkRBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1026
def spark_r_batch
  @spark_r_batch
end
#spark_sql_batch ⇒ Google::Apis::DataprocV1::SparkSqlBatch
A configuration for running Apache Spark SQL (https://spark.apache.org/sql/)
queries as a batch workload.
Corresponds to the JSON property sparkSqlBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1032
def spark_sql_batch
  @spark_sql_batch
end
#state ⇒ String
Output only. The state of the batch.
Corresponds to the JSON property state
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1037
def state
  @state
end
#state_history ⇒ Array<Google::Apis::DataprocV1::StateHistory>
Output only. Historical state information for the batch.
Corresponds to the JSON property stateHistory
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1042
def state_history
  @state_history
end
#state_message ⇒ String
Output only. Batch state details, such as a failure description if the state
is FAILED.
Corresponds to the JSON property stateMessage
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1048
def state_message
  @state_message
end
#state_time ⇒ String
Output only. The time when the batch entered its current state.
Corresponds to the JSON property stateTime
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1053
def state_time
  @state_time
end
#uuid ⇒ String
Output only. A batch UUID (Unique Universal Identifier). The service generates
this value when it creates the batch.
Corresponds to the JSON property uuid
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1059
def uuid
  @uuid
end
Instance Method Details
#update!(**args) ⇒ Object
Update properties of this object.
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1066
def update!(**args)
  @create_time = args[:create_time] if args.key?(:create_time)
  @creator = args[:creator] if args.key?(:creator)
  @environment_config = args[:environment_config] if args.key?(:environment_config)
  @labels = args[:labels] if args.key?(:labels)
  @name = args[:name] if args.key?(:name)
  @operation = args[:operation] if args.key?(:operation)
  @pyspark_batch = args[:pyspark_batch] if args.key?(:pyspark_batch)
  @runtime_config = args[:runtime_config] if args.key?(:runtime_config)
  @runtime_info = args[:runtime_info] if args.key?(:runtime_info)
  @spark_batch = args[:spark_batch] if args.key?(:spark_batch)
  @spark_r_batch = args[:spark_r_batch] if args.key?(:spark_r_batch)
  @spark_sql_batch = args[:spark_sql_batch] if args.key?(:spark_sql_batch)
  @state = args[:state] if args.key?(:state)
  @state_history = args[:state_history] if args.key?(:state_history)
  @state_message = args[:state_message] if args.key?(:state_message)
  @state_time = args[:state_time] if args.key?(:state_time)
  @uuid = args[:uuid] if args.key?(:uuid)
end
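Because each assignment in update! is guarded by args.key?, passing an explicit nil for a property clears it, while simply omitting the key leaves the current value untouched. The stand-in below mirrors that pattern with two of the properties; it is a sketch for illustration, not the gem class itself (the real class is Google::Apis::DataprocV1::Batch).

```ruby
# Minimal stand-in mirroring the Batch initialize/update! pattern.
# Only :name and :labels are modeled here.
class BatchLike
  attr_accessor :name, :labels

  def initialize(**args)
    update!(**args)
  end

  def update!(**args)
    # An absent key skips the assignment; an explicit nil overwrites.
    @name   = args[:name]   if args.key?(:name)
    @labels = args[:labels] if args.key?(:labels)
  end
end

batch = BatchLike.new(name: "projects/p/locations/l/batches/b",
                      labels: { "env" => "dev" })
batch.update!(labels: { "env" => "prod" })  # name is left untouched
batch.update!(name: nil)                    # name is explicitly cleared
```

This distinction matters when merging partial API responses into an existing object: only the fields present in the response hash are rewritten.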