Class: Google::Apis::DataprocV1::Batch
- Inherits: Object
- Includes:
- Core::Hashable, Core::JsonObjectSupport
- Defined in:
- lib/google/apis/dataproc_v1/classes.rb,
lib/google/apis/dataproc_v1/representations.rb
Overview
A representation of a batch workload in the service.
Instance Attribute Summary collapse
-
#create_time ⇒ String
Output only.
-
#creator ⇒ String
Output only.
-
#environment_config ⇒ Google::Apis::DataprocV1::EnvironmentConfig
Environment configuration for a workload.
-
#labels ⇒ Hash<String,String>
Optional.
-
#name ⇒ String
Output only.
-
#operation ⇒ String
Output only.
-
#pyspark_batch ⇒ Google::Apis::DataprocV1::PySparkBatch
A configuration for running an Apache PySpark (https://spark.apache.org/docs/latest/api/python/getting_started/quickstart.html) batch workload.
-
#pyspark_notebook_batch ⇒ Google::Apis::DataprocV1::PySparkNotebookBatch
A configuration for running a PySpark Notebook batch workload.
-
#runtime_config ⇒ Google::Apis::DataprocV1::RuntimeConfig
Runtime configuration for a workload.
-
#runtime_info ⇒ Google::Apis::DataprocV1::RuntimeInfo
Runtime information about workload execution.
-
#spark_batch ⇒ Google::Apis::DataprocV1::SparkBatch
A configuration for running an Apache Spark (https://spark.apache.org/) batch workload.
-
#spark_r_batch ⇒ Google::Apis::DataprocV1::SparkRBatch
A configuration for running an Apache SparkR (https://spark.apache.org/docs/latest/sparkr.html) batch workload.
-
#spark_sql_batch ⇒ Google::Apis::DataprocV1::SparkSqlBatch
A configuration for running Apache Spark SQL (https://spark.apache.org/sql/) queries as a batch workload.
-
#state ⇒ String
Output only.
-
#state_history ⇒ Array<Google::Apis::DataprocV1::StateHistory>
Output only.
-
#state_message ⇒ String
Output only.
-
#state_time ⇒ String
Output only.
-
#uuid ⇒ String
Output only.
Instance Method Summary collapse
-
#initialize(**args) ⇒ Batch
constructor
A new instance of Batch.
-
#update!(**args) ⇒ Object
Update properties of this object.
Constructor Details
#initialize(**args) ⇒ Batch
Returns a new instance of Batch.
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1107

def initialize(**args)
  update!(**args)
end
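The constructor simply forwards all keyword arguments to update!. The pattern can be sketched standalone, without the gem installed (BatchSketch and its two attributes are illustrative names, not part of the generated client):

```ruby
# A standalone sketch of the keyword-splat constructor pattern used by
# Batch: initialize forwards every keyword argument to update!, which
# assigns only the keys that were actually passed.
class BatchSketch
  attr_accessor :name, :creator

  def initialize(**args)
    update!(**args)
  end

  def update!(**args)
    @name = args[:name] if args.key?(:name)
    @creator = args[:creator] if args.key?(:creator)
  end
end

# Attributes not passed to the constructor stay nil until updated.
batch = BatchSketch.new(name: 'projects/p/locations/l/batches/b')
```

Because assignment is guarded by args.key?, omitted attributes keep their previous values across repeated update! calls.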
Instance Attribute Details
#create_time ⇒ String
Output only. The time when the batch was created.
Corresponds to the JSON property createTime
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1010

def create_time
  @create_time
end
#creator ⇒ String
Output only. The email address of the user who created the batch.
Corresponds to the JSON property creator
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1015

def creator
  @creator
end
#environment_config ⇒ Google::Apis::DataprocV1::EnvironmentConfig
Environment configuration for a workload.
Corresponds to the JSON property environmentConfig
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1020

def environment_config
  @environment_config
end
#labels ⇒ Hash<String,String>
Optional. The labels to associate with this batch. Label keys must contain 1 to 63 characters and must conform to RFC 1035 (https://www.ietf.org/rfc/rfc1035.txt). Label values may be empty, but, if present, must contain 1 to 63 characters and must conform to RFC 1035. No more than 32 labels can be associated with a batch.
Corresponds to the JSON property labels
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1029

def labels
  @labels
end
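The label constraints above can be checked client-side before submitting a batch. The sketch below is only an approximation of RFC 1035 label syntax (lowercase letter first, then letters, digits, and hyphens, not ending in a hyphen, 63 characters max); the service's actual validation may differ, and valid_batch_labels? is an illustrative helper, not part of the gem:

```ruby
# Approximate RFC 1035 label syntax: starts with a lowercase letter,
# ends with a letter or digit, 1-63 characters total.
RFC1035_LABEL = /\A[a-z]([-a-z0-9]{0,61}[a-z0-9])?\z/

# Returns true if the hash satisfies the documented Batch label rules:
# at most 32 labels, RFC 1035 keys, and values that are either empty
# or themselves RFC 1035 labels.
def valid_batch_labels?(labels)
  return false if labels.size > 32
  labels.all? do |key, value|
    key.match?(RFC1035_LABEL) &&
      (value.empty? || value.match?(RFC1035_LABEL))
  end
end
```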
#name ⇒ String
Output only. The resource name of the batch.
Corresponds to the JSON property name
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1034

def name
  @name
end
#operation ⇒ String
Output only. The resource name of the operation associated with this batch.
Corresponds to the JSON property operation
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1039

def operation
  @operation
end
#pyspark_batch ⇒ Google::Apis::DataprocV1::PySparkBatch
A configuration for running an Apache PySpark (https://spark.apache.org/docs/latest/api/python/getting_started/quickstart.html) batch workload.
Corresponds to the JSON property pysparkBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1045

def pyspark_batch
  @pyspark_batch
end
#pyspark_notebook_batch ⇒ Google::Apis::DataprocV1::PySparkNotebookBatch
A configuration for running a PySpark Notebook batch workload.
Corresponds to the JSON property pysparkNotebookBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1050

def pyspark_notebook_batch
  @pyspark_notebook_batch
end
#runtime_config ⇒ Google::Apis::DataprocV1::RuntimeConfig
Runtime configuration for a workload.
Corresponds to the JSON property runtimeConfig
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1055

def runtime_config
  @runtime_config
end
#runtime_info ⇒ Google::Apis::DataprocV1::RuntimeInfo
Runtime information about workload execution.
Corresponds to the JSON property runtimeInfo
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1060

def runtime_info
  @runtime_info
end
#spark_batch ⇒ Google::Apis::DataprocV1::SparkBatch
A configuration for running an Apache Spark (https://spark.apache.org/) batch
workload.
Corresponds to the JSON property sparkBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1066

def spark_batch
  @spark_batch
end
#spark_r_batch ⇒ Google::Apis::DataprocV1::SparkRBatch
A configuration for running an Apache SparkR (https://spark.apache.org/docs/latest/sparkr.html) batch workload.
Corresponds to the JSON property sparkRBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1072

def spark_r_batch
  @spark_r_batch
end
#spark_sql_batch ⇒ Google::Apis::DataprocV1::SparkSqlBatch
A configuration for running Apache Spark SQL (https://spark.apache.org/sql/)
queries as a batch workload.
Corresponds to the JSON property sparkSqlBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1078

def spark_sql_batch
  @spark_sql_batch
end
#state ⇒ String
Output only. The state of the batch.
Corresponds to the JSON property state
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1083

def state
  @state
end
#state_history ⇒ Array<Google::Apis::DataprocV1::StateHistory>
Output only. Historical state information for the batch.
Corresponds to the JSON property stateHistory
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1088

def state_history
  @state_history
end
#state_message ⇒ String
Output only. Batch state details, such as a failure description if the state
is FAILED.
Corresponds to the JSON property stateMessage
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1094

def state_message
  @state_message
end
#state_time ⇒ String
Output only. The time when the batch entered a current state.
Corresponds to the JSON property stateTime
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1099

def state_time
  @state_time
end
#uuid ⇒ String
Output only. A batch UUID (Universally Unique Identifier). The service generates
this value when it creates the batch.
Corresponds to the JSON property uuid
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1105

def uuid
  @uuid
end
Instance Method Details
#update!(**args) ⇒ Object
Update properties of this object.
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1112

def update!(**args)
  @create_time = args[:create_time] if args.key?(:create_time)
  @creator = args[:creator] if args.key?(:creator)
  @environment_config = args[:environment_config] if args.key?(:environment_config)
  @labels = args[:labels] if args.key?(:labels)
  @name = args[:name] if args.key?(:name)
  @operation = args[:operation] if args.key?(:operation)
  @pyspark_batch = args[:pyspark_batch] if args.key?(:pyspark_batch)
  @pyspark_notebook_batch = args[:pyspark_notebook_batch] if args.key?(:pyspark_notebook_batch)
  @runtime_config = args[:runtime_config] if args.key?(:runtime_config)
  @runtime_info = args[:runtime_info] if args.key?(:runtime_info)
  @spark_batch = args[:spark_batch] if args.key?(:spark_batch)
  @spark_r_batch = args[:spark_r_batch] if args.key?(:spark_r_batch)
  @spark_sql_batch = args[:spark_sql_batch] if args.key?(:spark_sql_batch)
  @state = args[:state] if args.key?(:state)
  @state_history = args[:state_history] if args.key?(:state_history)
  @state_message = args[:state_message] if args.key?(:state_message)
  @state_time = args[:state_time] if args.key?(:state_time)
  @uuid = args[:uuid] if args.key?(:uuid)
end
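A subtlety of the args.key? guard in update!: a key that is absent from the call leaves the attribute untouched, while a key passed with an explicit nil overwrites it. The following minimal stand-in class (GuardedUpdate is an illustrative name, not the generated client) demonstrates that distinction:

```ruby
# Demonstrates the args.key? guard used throughout update!: absent
# keys leave attributes untouched; an explicitly passed nil overwrites.
class GuardedUpdate
  attr_reader :state, :state_message

  def update!(**args)
    @state = args[:state] if args.key?(:state)
    @state_message = args[:state_message] if args.key?(:state_message)
  end
end

obj = GuardedUpdate.new
obj.update!(state: 'RUNNING', state_message: 'ok')
obj.update!(state: 'FAILED')      # :state_message absent, so it is kept
obj.update!(state_message: nil)   # explicit nil clears it
```

This is why update! is safe for partial updates: callers only name the properties they intend to change.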