AWS Glue

2020/02/12 - AWS Glue - 5 updated API methods

Changes: Update the AWS Glue client to the latest version

BatchGetJobs (updated)
Changes (response)
{'Jobs': {'NonOverridableArguments': {'string': 'string'}}}

Returns a list of resource metadata for a given list of job names. After calling the ListJobs operation, you can call this operation to access the data to which you have been granted permissions. This operation supports all IAM permissions, including permission conditions that use tags.

See also: AWS API Documentation

Request Syntax

client.batch_get_jobs(
    JobNames=[
        'string',
    ]
)
type JobNames:

list

param JobNames:

[REQUIRED]

A list of job names, which might be the names returned from the ListJobs operation.

  • (string) --

rtype:

dict

returns:

Response Syntax

{
    'Jobs': [
        {
            'Name': 'string',
            'Description': 'string',
            'LogUri': 'string',
            'Role': 'string',
            'CreatedOn': datetime(2015, 1, 1),
            'LastModifiedOn': datetime(2015, 1, 1),
            'ExecutionProperty': {
                'MaxConcurrentRuns': 123
            },
            'Command': {
                'Name': 'string',
                'ScriptLocation': 'string',
                'PythonVersion': 'string'
            },
            'DefaultArguments': {
                'string': 'string'
            },
            'NonOverridableArguments': {
                'string': 'string'
            },
            'Connections': {
                'Connections': [
                    'string',
                ]
            },
            'MaxRetries': 123,
            'AllocatedCapacity': 123,
            'Timeout': 123,
            'MaxCapacity': 123.0,
            'WorkerType': 'Standard'|'G.1X'|'G.2X',
            'NumberOfWorkers': 123,
            'SecurityConfiguration': 'string',
            'NotificationProperty': {
                'NotifyDelayAfter': 123
            },
            'GlueVersion': 'string'
        },
    ],
    'JobsNotFound': [
        'string',
    ]
}

Response Structure

  • (dict) --

    • Jobs (list) --

      A list of job definitions.

      • (dict) --

        Specifies a job definition.

        • Name (string) --

          The name you assign to this job definition.

        • Description (string) --

          A description of the job.

        • LogUri (string) --

          This field is reserved for future use.

        • Role (string) --

          The name or Amazon Resource Name (ARN) of the IAM role associated with this job.

        • CreatedOn (datetime) --

          The time and date that this job definition was created.

        • LastModifiedOn (datetime) --

          The last point in time when this job definition was modified.

        • ExecutionProperty (dict) --

          An ExecutionProperty specifying the maximum number of concurrent runs allowed for this job.

          • MaxConcurrentRuns (integer) --

            The maximum number of concurrent runs allowed for the job. The default is 1. An error is returned when this threshold is reached. The maximum value you can specify is controlled by a service limit.

        • Command (dict) --

          The JobCommand that executes this job.

          • Name (string) --

            The name of the job command. For an Apache Spark ETL job, this must be glueetl. For a Python shell job, it must be pythonshell.

          • ScriptLocation (string) --

            Specifies the Amazon Simple Storage Service (Amazon S3) path to a script that executes a job.

          • PythonVersion (string) --

            The Python version being used to execute a Python shell job. Allowed values are 2 or 3.

        • DefaultArguments (dict) --

          The default arguments for this job, specified as name-value pairs.

          You can specify arguments here that your own job-execution script consumes, as well as arguments that AWS Glue itself consumes.

          For information about how to specify and consume your own Job arguments, see the Calling AWS Glue APIs in Python topic in the developer guide.

          For information about the key-value pairs that AWS Glue consumes to set up your job, see the Special Parameters Used by AWS Glue topic in the developer guide.

          • (string) --

            • (string) --

        • NonOverridableArguments (dict) --

          Non-overridable arguments for this job, specified as name-value pairs.

          • (string) --

            • (string) --

        • Connections (dict) --

          The connections used for this job.

          • Connections (list) --

            A list of connections used by the job.

            • (string) --

        • MaxRetries (integer) --

          The maximum number of times to retry this job after a JobRun fails.

        • AllocatedCapacity (integer) --

          This field is deprecated. Use MaxCapacity instead.

          The number of AWS Glue data processing units (DPUs) allocated to runs of this job. You can allocate from 2 to 100 DPUs; the default is 10. A DPU is a relative measure of processing power that consists of 4 vCPUs of compute capacity and 16 GB of memory. For more information, see the AWS Glue pricing page.

        • Timeout (integer) --

          The job timeout in minutes. This is the maximum time that a job run can consume resources before it is terminated and enters TIMEOUT status. The default is 2,880 minutes (48 hours).

        • MaxCapacity (float) --

          The number of AWS Glue data processing units (DPUs) that can be allocated when this job runs. A DPU is a relative measure of processing power that consists of 4 vCPUs of compute capacity and 16 GB of memory. For more information, see the AWS Glue pricing page.

          Do not set Max Capacity if using WorkerType and NumberOfWorkers.

          The value that can be allocated for MaxCapacity depends on whether you are running a Python shell job or an Apache Spark ETL job:

          • When you specify a Python shell job (JobCommand.Name="pythonshell"), you can allocate either 0.0625 or 1 DPU. The default is 0.0625 DPU.

          • When you specify an Apache Spark ETL job (JobCommand.Name="glueetl"), you can allocate from 2 to 100 DPUs. The default is 10 DPUs. This job type cannot have a fractional DPU allocation.

        • WorkerType (string) --

          The type of predefined worker that is allocated when a job runs. Accepts a value of Standard, G.1X, or G.2X.

          • For the Standard worker type, each worker provides 4 vCPU, 16 GB of memory and a 50 GB disk, and 2 executors per worker.

          • For the G.1X worker type, each worker maps to 1 DPU (4 vCPU, 16 GB of memory, 64 GB disk), and provides 1 executor per worker. We recommend this worker type for memory-intensive jobs.

          • For the G.2X worker type, each worker maps to 2 DPU (8 vCPU, 32 GB of memory, 128 GB disk), and provides 1 executor per worker. We recommend this worker type for memory-intensive jobs.

        • NumberOfWorkers (integer) --

          The number of workers of a defined workerType that are allocated when a job runs.

          The maximum number of workers you can define is 299 for G.1X and 149 for G.2X.

        • SecurityConfiguration (string) --

          The name of the SecurityConfiguration structure to be used with this job.

        • NotificationProperty (dict) --

          Specifies configuration properties of a job notification.

          • NotifyDelayAfter (integer) --

            After a job run starts, the number of minutes to wait before sending a job run delay notification.

        • GlueVersion (string) --

          Glue version determines the versions of Apache Spark and Python that AWS Glue supports. The Python version indicates the version supported for jobs of type Spark.

          For more information about the available AWS Glue versions and corresponding Spark and Python versions, see Glue version in the developer guide.

          Jobs that are created without specifying a Glue version default to Glue 0.9.

    • JobsNotFound (list) --

      A list of names of jobs not found.

      • (string) --
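
A minimal usage sketch (the job names are placeholders for names returned by ListJobs, and boto3 credentials and a region are assumed to be configured) that fetches several definitions in one call and reads the new NonOverridableArguments field:

import boto3

glue = boto3.client('glue')

# 'job-a' and 'job-b' are hypothetical job names.
response = glue.batch_get_jobs(JobNames=['job-a', 'job-b'])

for job in response['Jobs']:
    # NonOverridableArguments is the field added in this update; it may be absent.
    print(job['Name'], job.get('NonOverridableArguments', {}))

# Names that could not be resolved are returned separately.
print('Not found:', response['JobsNotFound'])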

CreateJob (updated)
Changes (request)
{'NonOverridableArguments': {'string': 'string'}}

Creates a new job definition.

See also: AWS API Documentation

Request Syntax

client.create_job(
    Name='string',
    Description='string',
    LogUri='string',
    Role='string',
    ExecutionProperty={
        'MaxConcurrentRuns': 123
    },
    Command={
        'Name': 'string',
        'ScriptLocation': 'string',
        'PythonVersion': 'string'
    },
    DefaultArguments={
        'string': 'string'
    },
    NonOverridableArguments={
        'string': 'string'
    },
    Connections={
        'Connections': [
            'string',
        ]
    },
    MaxRetries=123,
    AllocatedCapacity=123,
    Timeout=123,
    MaxCapacity=123.0,
    SecurityConfiguration='string',
    Tags={
        'string': 'string'
    },
    NotificationProperty={
        'NotifyDelayAfter': 123
    },
    GlueVersion='string',
    NumberOfWorkers=123,
    WorkerType='Standard'|'G.1X'|'G.2X'
)
type Name:

string

param Name:

[REQUIRED]

The name you assign to this job definition. It must be unique in your account.

type Description:

string

param Description:

Description of the job being defined.

type LogUri:

string

param LogUri:

This field is reserved for future use.

type Role:

string

param Role:

[REQUIRED]

The name or Amazon Resource Name (ARN) of the IAM role associated with this job.

type ExecutionProperty:

dict

param ExecutionProperty:

An ExecutionProperty specifying the maximum number of concurrent runs allowed for this job.

  • MaxConcurrentRuns (integer) --

    The maximum number of concurrent runs allowed for the job. The default is 1. An error is returned when this threshold is reached. The maximum value you can specify is controlled by a service limit.

type Command:

dict

param Command:

[REQUIRED]

The JobCommand that executes this job.

  • Name (string) --

    The name of the job command. For an Apache Spark ETL job, this must be glueetl. For a Python shell job, it must be pythonshell.

  • ScriptLocation (string) --

    Specifies the Amazon Simple Storage Service (Amazon S3) path to a script that executes a job.

  • PythonVersion (string) --

    The Python version being used to execute a Python shell job. Allowed values are 2 or 3.

type DefaultArguments:

dict

param DefaultArguments:

The default arguments for this job.

You can specify arguments here that your own job-execution script consumes, as well as arguments that AWS Glue itself consumes.

For information about how to specify and consume your own Job arguments, see the Calling AWS Glue APIs in Python topic in the developer guide.

For information about the key-value pairs that AWS Glue consumes to set up your job, see the Special Parameters Used by AWS Glue topic in the developer guide.

  • (string) --

    • (string) --

type NonOverridableArguments:

dict

param NonOverridableArguments:

Non-overridable arguments for this job, specified as name-value pairs.

  • (string) --

    • (string) --

type Connections:

dict

param Connections:

The connections used for this job.

  • Connections (list) --

    A list of connections used by the job.

    • (string) --

type MaxRetries:

integer

param MaxRetries:

The maximum number of times to retry this job if it fails.

type AllocatedCapacity:

integer

param AllocatedCapacity:

This parameter is deprecated. Use MaxCapacity instead.

The number of AWS Glue data processing units (DPUs) to allocate to this Job. You can allocate from 2 to 100 DPUs; the default is 10. A DPU is a relative measure of processing power that consists of 4 vCPUs of compute capacity and 16 GB of memory. For more information, see the AWS Glue pricing page.

type Timeout:

integer

param Timeout:

The job timeout in minutes. This is the maximum time that a job run can consume resources before it is terminated and enters TIMEOUT status. The default is 2,880 minutes (48 hours).

type MaxCapacity:

float

param MaxCapacity:

The number of AWS Glue data processing units (DPUs) that can be allocated when this job runs. A DPU is a relative measure of processing power that consists of 4 vCPUs of compute capacity and 16 GB of memory. For more information, see the AWS Glue pricing page.

Do not set Max Capacity if using WorkerType and NumberOfWorkers.

The value that can be allocated for MaxCapacity depends on whether you are running a Python shell job or an Apache Spark ETL job:

  • When you specify a Python shell job (JobCommand.Name="pythonshell"), you can allocate either 0.0625 or 1 DPU. The default is 0.0625 DPU.

  • When you specify an Apache Spark ETL job (JobCommand.Name="glueetl"), you can allocate from 2 to 100 DPUs. The default is 10 DPUs. This job type cannot have a fractional DPU allocation.

type SecurityConfiguration:

string

param SecurityConfiguration:

The name of the SecurityConfiguration structure to be used with this job.

type Tags:

dict

param Tags:

The tags to use with this job. You may use tags to limit access to the job. For more information about tags in AWS Glue, see AWS Tags in AWS Glue in the developer guide.

  • (string) --

    • (string) --

type NotificationProperty:

dict

param NotificationProperty:

Specifies configuration properties of a job notification.

  • NotifyDelayAfter (integer) --

    After a job run starts, the number of minutes to wait before sending a job run delay notification.

type GlueVersion:

string

param GlueVersion:

Glue version determines the versions of Apache Spark and Python that AWS Glue supports. The Python version indicates the version supported for jobs of type Spark.

For more information about the available AWS Glue versions and corresponding Spark and Python versions, see Glue version in the developer guide.

Jobs that are created without specifying a Glue version default to Glue 0.9.

type NumberOfWorkers:

integer

param NumberOfWorkers:

The number of workers of a defined workerType that are allocated when a job runs.

The maximum number of workers you can define is 299 for G.1X and 149 for G.2X.

type WorkerType:

string

param WorkerType:

The type of predefined worker that is allocated when a job runs. Accepts a value of Standard, G.1X, or G.2X.

  • For the Standard worker type, each worker provides 4 vCPU, 16 GB of memory and a 50 GB disk, and 2 executors per worker.

  • For the G.1X worker type, each worker maps to 1 DPU (4 vCPU, 16 GB of memory, 64 GB disk), and provides 1 executor per worker. We recommend this worker type for memory-intensive jobs.

  • For the G.2X worker type, each worker maps to 2 DPU (8 vCPU, 32 GB of memory, 128 GB disk), and provides 1 executor per worker. We recommend this worker type for memory-intensive jobs.

rtype:

dict

returns:

Response Syntax

{
    'Name': 'string'
}

Response Structure

  • (dict) --

    • Name (string) --

      The unique name that was provided for this job definition.
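
As a rough sketch only (the role ARN, script location, and argument names below are placeholders, not values from this documentation), a Spark ETL job with non-overridable arguments might be created like this:

import boto3

glue = boto3.client('glue')

response = glue.create_job(
    Name='example-etl-job',
    Role='arn:aws:iam::123456789012:role/ExampleGlueRole',   # placeholder ARN
    Command={
        'Name': 'glueetl',
        'ScriptLocation': 's3://example-bucket/scripts/job.py',
        'PythonVersion': '3'
    },
    GlueVersion='1.0',
    WorkerType='G.1X',
    NumberOfWorkers=10,   # MaxCapacity is omitted because WorkerType and NumberOfWorkers are set
    Timeout=60,
    # Arguments a job run may override at start time.
    DefaultArguments={'--TempDir': 's3://example-bucket/temp/'},
    # Arguments a job run cannot override (the field added in this update).
    NonOverridableArguments={'--environment': 'prod'}
)
print(response['Name'])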

GetJob (updated)
Changes (response)
{'Job': {'NonOverridableArguments': {'string': 'string'}}}

Retrieves an existing job definition.

See also: AWS API Documentation

Request Syntax

client.get_job(
    JobName='string'
)
type JobName:

string

param JobName:

[REQUIRED]

The name of the job definition to retrieve.

rtype:

dict

returns:

Response Syntax

{
    'Job': {
        'Name': 'string',
        'Description': 'string',
        'LogUri': 'string',
        'Role': 'string',
        'CreatedOn': datetime(2015, 1, 1),
        'LastModifiedOn': datetime(2015, 1, 1),
        'ExecutionProperty': {
            'MaxConcurrentRuns': 123
        },
        'Command': {
            'Name': 'string',
            'ScriptLocation': 'string',
            'PythonVersion': 'string'
        },
        'DefaultArguments': {
            'string': 'string'
        },
        'NonOverridableArguments': {
            'string': 'string'
        },
        'Connections': {
            'Connections': [
                'string',
            ]
        },
        'MaxRetries': 123,
        'AllocatedCapacity': 123,
        'Timeout': 123,
        'MaxCapacity': 123.0,
        'WorkerType': 'Standard'|'G.1X'|'G.2X',
        'NumberOfWorkers': 123,
        'SecurityConfiguration': 'string',
        'NotificationProperty': {
            'NotifyDelayAfter': 123
        },
        'GlueVersion': 'string'
    }
}

Response Structure

  • (dict) --

    • Job (dict) --

      The requested job definition.

      • Name (string) --

        The name you assign to this job definition.

      • Description (string) --

        A description of the job.

      • LogUri (string) --

        This field is reserved for future use.

      • Role (string) --

        The name or Amazon Resource Name (ARN) of the IAM role associated with this job.

      • CreatedOn (datetime) --

        The time and date that this job definition was created.

      • LastModifiedOn (datetime) --

        The last point in time when this job definition was modified.

      • ExecutionProperty (dict) --

        An ExecutionProperty specifying the maximum number of concurrent runs allowed for this job.

        • MaxConcurrentRuns (integer) --

          The maximum number of concurrent runs allowed for the job. The default is 1. An error is returned when this threshold is reached. The maximum value you can specify is controlled by a service limit.

      • Command (dict) --

        The JobCommand that executes this job.

        • Name (string) --

          The name of the job command. For an Apache Spark ETL job, this must be glueetl. For a Python shell job, it must be pythonshell.

        • ScriptLocation (string) --

          Specifies the Amazon Simple Storage Service (Amazon S3) path to a script that executes a job.

        • PythonVersion (string) --

          The Python version being used to execute a Python shell job. Allowed values are 2 or 3.

      • DefaultArguments (dict) --

        The default arguments for this job, specified as name-value pairs.

        You can specify arguments here that your own job-execution script consumes, as well as arguments that AWS Glue itself consumes.

        For information about how to specify and consume your own Job arguments, see the Calling AWS Glue APIs in Python topic in the developer guide.

        For information about the key-value pairs that AWS Glue consumes to set up your job, see the Special Parameters Used by AWS Glue topic in the developer guide.

        • (string) --

          • (string) --

      • NonOverridableArguments (dict) --

        Non-overridable arguments for this job, specified as name-value pairs.

        • (string) --

          • (string) --

      • Connections (dict) --

        The connections used for this job.

        • Connections (list) --

          A list of connections used by the job.

          • (string) --

      • MaxRetries (integer) --

        The maximum number of times to retry this job after a JobRun fails.

      • AllocatedCapacity (integer) --

        This field is deprecated. Use MaxCapacity instead.

        The number of AWS Glue data processing units (DPUs) allocated to runs of this job. You can allocate from 2 to 100 DPUs; the default is 10. A DPU is a relative measure of processing power that consists of 4 vCPUs of compute capacity and 16 GB of memory. For more information, see the AWS Glue pricing page.

      • Timeout (integer) --

        The job timeout in minutes. This is the maximum time that a job run can consume resources before it is terminated and enters TIMEOUT status. The default is 2,880 minutes (48 hours).

      • MaxCapacity (float) --

        The number of AWS Glue data processing units (DPUs) that can be allocated when this job runs. A DPU is a relative measure of processing power that consists of 4 vCPUs of compute capacity and 16 GB of memory. For more information, see the AWS Glue pricing page.

        Do not set Max Capacity if using WorkerType and NumberOfWorkers.

        The value that can be allocated for MaxCapacity depends on whether you are running a Python shell job or an Apache Spark ETL job:

        • When you specify a Python shell job (JobCommand.Name="pythonshell"), you can allocate either 0.0625 or 1 DPU. The default is 0.0625 DPU.

        • When you specify an Apache Spark ETL job (JobCommand.Name="glueetl"), you can allocate from 2 to 100 DPUs. The default is 10 DPUs. This job type cannot have a fractional DPU allocation.

      • WorkerType (string) --

        The type of predefined worker that is allocated when a job runs. Accepts a value of Standard, G.1X, or G.2X.

        • For the Standard worker type, each worker provides 4 vCPU, 16 GB of memory and a 50 GB disk, and 2 executors per worker.

        • For the G.1X worker type, each worker maps to 1 DPU (4 vCPU, 16 GB of memory, 64 GB disk), and provides 1 executor per worker. We recommend this worker type for memory-intensive jobs.

        • For the G.2X worker type, each worker maps to 2 DPU (8 vCPU, 32 GB of memory, 128 GB disk), and provides 1 executor per worker. We recommend this worker type for memory-intensive jobs.

      • NumberOfWorkers (integer) --

        The number of workers of a defined workerType that are allocated when a job runs.

        The maximum number of workers you can define is 299 for G.1X and 149 for G.2X.

      • SecurityConfiguration (string) --

        The name of the SecurityConfiguration structure to be used with this job.

      • NotificationProperty (dict) --

        Specifies configuration properties of a job notification.

        • NotifyDelayAfter (integer) --

          After a job run starts, the number of minutes to wait before sending a job run delay notification.

      • GlueVersion (string) --

        Glue version determines the versions of Apache Spark and Python that AWS Glue supports. The Python version indicates the version supported for jobs of type Spark.

        For more information about the available AWS Glue versions and corresponding Spark and Python versions, see Glue version in the developer guide.

        Jobs that are created without specifying a Glue version default to Glue 0.9.
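
A short sketch (the job name is hypothetical) that retrieves a single definition and separates arguments a run can override from those it cannot:

import boto3

glue = boto3.client('glue')

# 'example-etl-job' is a placeholder name.
job = glue.get_job(JobName='example-etl-job')['Job']

# DefaultArguments can be overridden per run; NonOverridableArguments cannot.
print('Default arguments:        ', job.get('DefaultArguments', {}))
print('Non-overridable arguments:', job.get('NonOverridableArguments', {}))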

GetJobs (updated)
Changes (response)
{'Jobs': {'NonOverridableArguments': {'string': 'string'}}}

Retrieves all current job definitions.

See also: AWS API Documentation

Request Syntax

client.get_jobs(
    NextToken='string',
    MaxResults=123
)
type NextToken:

string

param NextToken:

A continuation token, if this is a continuation call.

type MaxResults:

integer

param MaxResults:

The maximum size of the response.

rtype:

dict

returns:

Response Syntax

{
    'Jobs': [
        {
            'Name': 'string',
            'Description': 'string',
            'LogUri': 'string',
            'Role': 'string',
            'CreatedOn': datetime(2015, 1, 1),
            'LastModifiedOn': datetime(2015, 1, 1),
            'ExecutionProperty': {
                'MaxConcurrentRuns': 123
            },
            'Command': {
                'Name': 'string',
                'ScriptLocation': 'string',
                'PythonVersion': 'string'
            },
            'DefaultArguments': {
                'string': 'string'
            },
            'NonOverridableArguments': {
                'string': 'string'
            },
            'Connections': {
                'Connections': [
                    'string',
                ]
            },
            'MaxRetries': 123,
            'AllocatedCapacity': 123,
            'Timeout': 123,
            'MaxCapacity': 123.0,
            'WorkerType': 'Standard'|'G.1X'|'G.2X',
            'NumberOfWorkers': 123,
            'SecurityConfiguration': 'string',
            'NotificationProperty': {
                'NotifyDelayAfter': 123
            },
            'GlueVersion': 'string'
        },
    ],
    'NextToken': 'string'
}

Response Structure

  • (dict) --

    • Jobs (list) --

      A list of job definitions.

      • (dict) --

        Specifies a job definition.

        • Name (string) --

          The name you assign to this job definition.

        • Description (string) --

          A description of the job.

        • LogUri (string) --

          This field is reserved for future use.

        • Role (string) --

          The name or Amazon Resource Name (ARN) of the IAM role associated with this job.

        • CreatedOn (datetime) --

          The time and date that this job definition was created.

        • LastModifiedOn (datetime) --

          The last point in time when this job definition was modified.

        • ExecutionProperty (dict) --

          An ExecutionProperty specifying the maximum number of concurrent runs allowed for this job.

          • MaxConcurrentRuns (integer) --

            The maximum number of concurrent runs allowed for the job. The default is 1. An error is returned when this threshold is reached. The maximum value you can specify is controlled by a service limit.

        • Command (dict) --

          The JobCommand that executes this job.

          • Name (string) --

            The name of the job command. For an Apache Spark ETL job, this must be glueetl. For a Python shell job, it must be pythonshell.

          • ScriptLocation (string) --

            Specifies the Amazon Simple Storage Service (Amazon S3) path to a script that executes a job.

          • PythonVersion (string) --

            The Python version being used to execute a Python shell job. Allowed values are 2 or 3.

        • DefaultArguments (dict) --

          The default arguments for this job, specified as name-value pairs.

          You can specify arguments here that your own job-execution script consumes, as well as arguments that AWS Glue itself consumes.

          For information about how to specify and consume your own Job arguments, see the Calling AWS Glue APIs in Python topic in the developer guide.

          For information about the key-value pairs that AWS Glue consumes to set up your job, see the Special Parameters Used by AWS Glue topic in the developer guide.

          • (string) --

            • (string) --

        • NonOverridableArguments (dict) --

          Non-overridable arguments for this job, specified as name-value pairs.

          • (string) --

            • (string) --

        • Connections (dict) --

          The connections used for this job.

          • Connections (list) --

            A list of connections used by the job.

            • (string) --

        • MaxRetries (integer) --

          The maximum number of times to retry this job after a JobRun fails.

        • AllocatedCapacity (integer) --

          This field is deprecated. Use MaxCapacity instead.

          The number of AWS Glue data processing units (DPUs) allocated to runs of this job. You can allocate from 2 to 100 DPUs; the default is 10. A DPU is a relative measure of processing power that consists of 4 vCPUs of compute capacity and 16 GB of memory. For more information, see the AWS Glue pricing page.

        • Timeout (integer) --

          The job timeout in minutes. This is the maximum time that a job run can consume resources before it is terminated and enters TIMEOUT status. The default is 2,880 minutes (48 hours).

        • MaxCapacity (float) --

          The number of AWS Glue data processing units (DPUs) that can be allocated when this job runs. A DPU is a relative measure of processing power that consists of 4 vCPUs of compute capacity and 16 GB of memory. For more information, see the AWS Glue pricing page.

          Do not set Max Capacity if using WorkerType and NumberOfWorkers.

          The value that can be allocated for MaxCapacity depends on whether you are running a Python shell job or an Apache Spark ETL job:

          • When you specify a Python shell job (JobCommand.Name="pythonshell"), you can allocate either 0.0625 or 1 DPU. The default is 0.0625 DPU.

          • When you specify an Apache Spark ETL job (JobCommand.Name="glueetl"), you can allocate from 2 to 100 DPUs. The default is 10 DPUs. This job type cannot have a fractional DPU allocation.

        • WorkerType (string) --

          The type of predefined worker that is allocated when a job runs. Accepts a value of Standard, G.1X, or G.2X.

          • For the Standard worker type, each worker provides 4 vCPU, 16 GB of memory and a 50 GB disk, and 2 executors per worker.

          • For the G.1X worker type, each worker maps to 1 DPU (4 vCPU, 16 GB of memory, 64 GB disk), and provides 1 executor per worker. We recommend this worker type for memory-intensive jobs.

          • For the G.2X worker type, each worker maps to 2 DPU (8 vCPU, 32 GB of memory, 128 GB disk), and provides 1 executor per worker. We recommend this worker type for memory-intensive jobs.

        • NumberOfWorkers (integer) --

          The number of workers of a defined workerType that are allocated when a job runs.

          The maximum number of workers you can define is 299 for G.1X and 149 for G.2X.

        • SecurityConfiguration (string) --

          The name of the SecurityConfiguration structure to be used with this job.

        • NotificationProperty (dict) --

          Specifies configuration properties of a job notification.

          • NotifyDelayAfter (integer) --

            After a job run starts, the number of minutes to wait before sending a job run delay notification.

        • GlueVersion (string) --

          Glue version determines the versions of Apache Spark and Python that AWS Glue supports. The Python version indicates the version supported for jobs of type Spark.

          For more information about the available AWS Glue versions and corresponding Spark and Python versions, see Glue version in the developer guide.

          Jobs that are created without specifying a Glue version default to Glue 0.9.

    • NextToken (string) --

      A continuation token, if not all job definitions have yet been returned.
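
Because GetJobs returns results in pages, a sketch of collecting every job definition with the NextToken loop (the page size is an arbitrary choice) looks like:

import boto3

glue = boto3.client('glue')

jobs = []
kwargs = {'MaxResults': 100}          # arbitrary page size
while True:
    page = glue.get_jobs(**kwargs)
    jobs.extend(page['Jobs'])
    token = page.get('NextToken')
    if not token:                     # no token means this was the last page
        break
    kwargs['NextToken'] = token

print(len(jobs), 'job definitions retrieved')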

UpdateJob (updated)
Changes (request)
{'JobUpdate': {'NonOverridableArguments': {'string': 'string'}}}

Updates an existing job definition.

See also: AWS API Documentation

Request Syntax

client.update_job(
    JobName='string',
    JobUpdate={
        'Description': 'string',
        'LogUri': 'string',
        'Role': 'string',
        'ExecutionProperty': {
            'MaxConcurrentRuns': 123
        },
        'Command': {
            'Name': 'string',
            'ScriptLocation': 'string',
            'PythonVersion': 'string'
        },
        'DefaultArguments': {
            'string': 'string'
        },
        'NonOverridableArguments': {
            'string': 'string'
        },
        'Connections': {
            'Connections': [
                'string',
            ]
        },
        'MaxRetries': 123,
        'AllocatedCapacity': 123,
        'Timeout': 123,
        'MaxCapacity': 123.0,
        'WorkerType': 'Standard'|'G.1X'|'G.2X',
        'NumberOfWorkers': 123,
        'SecurityConfiguration': 'string',
        'NotificationProperty': {
            'NotifyDelayAfter': 123
        },
        'GlueVersion': 'string'
    }
)
type JobName:

string

param JobName:

[REQUIRED]

The name of the job definition to update.

type JobUpdate:

dict

param JobUpdate:

[REQUIRED]

Specifies the values with which to update the job definition.

  • Description (string) --

    Description of the job being defined.

  • LogUri (string) --

    This field is reserved for future use.

  • Role (string) --

    The name or Amazon Resource Name (ARN) of the IAM role associated with this job (required).

  • ExecutionProperty (dict) --

    An ExecutionProperty specifying the maximum number of concurrent runs allowed for this job.

    • MaxConcurrentRuns (integer) --

      The maximum number of concurrent runs allowed for the job. The default is 1. An error is returned when this threshold is reached. The maximum value you can specify is controlled by a service limit.

  • Command (dict) --

    The JobCommand that executes this job (required).

    • Name (string) --

      The name of the job command. For an Apache Spark ETL job, this must be glueetl. For a Python shell job, it must be pythonshell.

    • ScriptLocation (string) --

      Specifies the Amazon Simple Storage Service (Amazon S3) path to a script that executes a job.

    • PythonVersion (string) --

      The Python version being used to execute a Python shell job. Allowed values are 2 or 3.

  • DefaultArguments (dict) --

    The default arguments for this job.

    You can specify arguments here that your own job-execution script consumes, as well as arguments that AWS Glue itself consumes.

    For information about how to specify and consume your own Job arguments, see the Calling AWS Glue APIs in Python topic in the developer guide.

    For information about the key-value pairs that AWS Glue consumes to set up your job, see the Special Parameters Used by AWS Glue topic in the developer guide.

    • (string) --

      • (string) --

  • NonOverridableArguments (dict) --

    Non-overridable arguments for this job, specified as name-value pairs.

    • (string) --

      • (string) --

  • Connections (dict) --

    The connections used for this job.

    • Connections (list) --

      A list of connections used by the job.

      • (string) --

  • MaxRetries (integer) --

    The maximum number of times to retry this job if it fails.

  • AllocatedCapacity (integer) --

    This field is deprecated. Use MaxCapacity instead.

    The number of AWS Glue data processing units (DPUs) to allocate to this job. You can allocate from 2 to 100 DPUs; the default is 10. A DPU is a relative measure of processing power that consists of 4 vCPUs of compute capacity and 16 GB of memory. For more information, see the AWS Glue pricing page.

  • Timeout (integer) --

    The job timeout in minutes. This is the maximum time that a job run can consume resources before it is terminated and enters TIMEOUT status. The default is 2,880 minutes (48 hours).

  • MaxCapacity (float) --

    The number of AWS Glue data processing units (DPUs) that can be allocated when this job runs. A DPU is a relative measure of processing power that consists of 4 vCPUs of compute capacity and 16 GB of memory. For more information, see the AWS Glue pricing page.

    Do not set Max Capacity if using WorkerType and NumberOfWorkers.

    The value that can be allocated for MaxCapacity depends on whether you are running a Python shell job or an Apache Spark ETL job:

    • When you specify a Python shell job (JobCommand.Name="pythonshell"), you can allocate either 0.0625 or 1 DPU. The default is 0.0625 DPU.

    • When you specify an Apache Spark ETL job (JobCommand.Name="glueetl"), you can allocate from 2 to 100 DPUs. The default is 10 DPUs. This job type cannot have a fractional DPU allocation.

  • WorkerType (string) --

    The type of predefined worker that is allocated when a job runs. Accepts a value of Standard, G.1X, or G.2X.

    • For the Standard worker type, each worker provides 4 vCPU, 16 GB of memory and a 50 GB disk, and 2 executors per worker.

    • For the G.1X worker type, each worker maps to 1 DPU (4 vCPU, 16 GB of memory, 64 GB disk), and provides 1 executor per worker. We recommend this worker type for memory-intensive jobs.

    • For the G.2X worker type, each worker maps to 2 DPU (8 vCPU, 32 GB of memory, 128 GB disk), and provides 1 executor per worker. We recommend this worker type for memory-intensive jobs.

  • NumberOfWorkers (integer) --

    The number of workers of a defined workerType that are allocated when a job runs.

    The maximum number of workers you can define is 299 for G.1X and 149 for G.2X.

  • SecurityConfiguration (string) --

    The name of the SecurityConfiguration structure to be used with this job.

  • NotificationProperty (dict) --

    Specifies the configuration properties of a job notification.

    • NotifyDelayAfter (integer) --

      After a job run starts, the number of minutes to wait before sending a job run delay notification.

  • GlueVersion (string) --

    Glue version determines the versions of Apache Spark and Python that AWS Glue supports. The Python version indicates the version supported for jobs of type Spark.

    For more information about the available AWS Glue versions and corresponding Spark and Python versions, see Glue version in the developer guide.

rtype:

dict

returns:

Response Syntax

{
    'JobName': 'string'
}

Response Structure

  • (dict) --

    • JobName (string) --

      Returns the name of the updated job definition.
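
A sketch of adding a non-overridable argument to an existing job (the job name and the argument key/value are illustrative). Role and Command are resent because JobUpdate marks them as required; starting from the current definition is a cautious way to avoid dropping settings you want to keep:

import boto3

glue = boto3.client('glue')

# Placeholder job name.
current = glue.get_job(JobName='example-etl-job')['Job']

response = glue.update_job(
    JobName='example-etl-job',
    JobUpdate={
        'Role': current['Role'],          # required in JobUpdate
        'Command': current['Command'],    # required in JobUpdate
        'NonOverridableArguments': {
            **current.get('NonOverridableArguments', {}),
            '--environment': 'prod'       # illustrative key/value
        }
    }
)
print(response['JobName'])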