2025/10/27 - Amazon Kinesis - 1 new, 2 updated API methods
Changes Adds support for record sizes up to 10 MiB and introduces a new UpdateMaxRecordSize API to modify stream record size limits. Adds record size parameters to the existing CreateStream and DescribeStreamSummary APIs for request and response payloads, respectively.
UpdateMaxRecordSize (new)
This operation updates the maximum size (MaxRecordSize) of a single record that you can write to, and read from, a stream. You can ingest and retrieve single records of up to 10240 KiB.
See also: AWS API Documentation
Request Syntax
client.update_max_record_size(
    StreamARN='string',
    MaxRecordSizeInKiB=123
)
StreamARN (string)
The Amazon Resource Name (ARN) of the stream for the MaxRecordSize update.
MaxRecordSizeInKiB (integer) -- [REQUIRED]
The maximum record size of a single record, in KiB, that you can write to, and read from, a stream. Specify a value between 1024 and 10240 KiB (1 to 10 MiB). If you specify a value that is out of this range, UpdateMaxRecordSize returns a ValidationException.
None
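For reference, the following is a minimal boto3 sketch of calling the new operation; the Region, account ID, and stream ARN are illustrative placeholders, not values taken from this page.
import boto3

# Placeholder Region and stream ARN for illustration only.
kinesis = boto3.client('kinesis', region_name='us-east-1')

kinesis.update_max_record_size(
    StreamARN='arn:aws:kinesis:us-east-1:123456789012:stream/example-stream',
    MaxRecordSizeInKiB=10240  # must fall between 1024 and 10240 KiB
)
# The operation has no modeled response fields; a value outside the
# 1024-10240 KiB range results in a ValidationException.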
CreateStream (updated)
Changes (request)
{'MaxRecordSizeInKiB': 'integer'}
Creates a Kinesis data stream. A stream captures and transports data records that are continuously emitted from different data sources or producers. Scale-out within a stream is explicitly supported by means of shards, which are uniquely identified groups of data records in a stream.
You can create your data stream using either on-demand or provisioned capacity mode. Data streams with an on-demand mode require no capacity planning and automatically scale to handle gigabytes of write and read throughput per minute. With the on-demand mode, Kinesis Data Streams automatically manages the shards in order to provide the necessary throughput. For the data streams with a provisioned mode, you must specify the number of shards for the data stream. Each shard can support reads up to five transactions per second, up to a maximum data read total of 2 MiB per second. Each shard can support writes up to 1,000 records per second, up to a maximum data write total of 1 MiB per second. If the amount of data input increases or decreases, you can add or remove shards.
The stream name identifies the stream. The name is scoped to the Amazon Web Services account used by the application. It is also scoped by Amazon Web Services Region. That is, two streams in two different accounts can have the same name, and two streams in the same account, but in two different Regions, can have the same name.
CreateStream is an asynchronous operation. Upon receiving a CreateStream request, Kinesis Data Streams immediately returns and sets the stream status to CREATING. After the stream is created, Kinesis Data Streams sets the stream status to ACTIVE. You should perform read and write operations only on an ACTIVE stream.
You receive a LimitExceededException when making a CreateStream request if you try to do one of the following:
Have more than five streams in the CREATING state at any point in time.
Create more shards than are authorized for your account.
For the default shard limit for an Amazon Web Services account, see Amazon Kinesis Data Streams Limits in the Amazon Kinesis Data Streams Developer Guide. To increase this limit, contact Amazon Web Services Support.
You can use DescribeStreamSummary to check the stream status, which is returned in StreamStatus.
CreateStream has a limit of five transactions per second per account.
You can add tags to the stream when making a CreateStream request by setting the Tags parameter. If you pass the Tags parameter, in addition to having the kinesis:CreateStream permission, you must also have the kinesis:AddTagsToStream permission for the stream that will be created. The kinesis:TagResource permission won’t work to tag streams on creation. Tags will take effect from the CREATING status of the stream, but you can't make any updates to the tags until the stream is in ACTIVE state.
See also: AWS API Documentation
Request Syntax
client.create_stream(
    StreamName='string',
    ShardCount=123,
    StreamModeDetails={
        'StreamMode': 'PROVISIONED'|'ON_DEMAND'
    },
    Tags={
        'string': 'string'
    },
    MaxRecordSizeInKiB=123
)
StreamName (string) -- [REQUIRED]
A name to identify the stream. The stream name is scoped to the Amazon Web Services account used by the application that creates the stream. It is also scoped by Amazon Web Services Region. That is, two streams in two different Amazon Web Services accounts can have the same name. Two streams in the same Amazon Web Services account but in two different Regions can also have the same name.
ShardCount (integer)
The number of shards that the stream will use. The throughput of the stream is a function of the number of shards; more shards are required for greater provisioned throughput.
StreamModeDetails (dict)
Indicates the capacity mode of the data stream. Currently, in Kinesis Data Streams, you can choose between an on-demand capacity mode and a provisioned capacity mode for your data streams.
StreamMode (string) -- [REQUIRED]
Specifies the capacity mode to which you want to set your data stream. Currently, in Kinesis Data Streams, you can choose between an on-demand capacity mode and a provisioned capacity mode for your data streams.
Tags (dict)
A set of up to 50 key-value pairs to use to create the tags. A tag consists of a required key and an optional value.
(string) --
(string) --
MaxRecordSizeInKiB (integer)
The maximum record size of a single record, in kibibytes (KiB), that you can write to, and read from, a stream.
None
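As a sketch of how the new parameter fits into stream creation, the following creates an on-demand stream with a 10 MiB record size limit and tags applied at creation, then polls DescribeStreamSummary until the stream is ACTIVE. The Region, stream name, and tag values are placeholders; a simple polling loop stands in here for whatever waiting strategy your application uses.
import time

import boto3

# Placeholder Region, stream name, and tag values for illustration only.
kinesis = boto3.client('kinesis', region_name='us-east-1')

kinesis.create_stream(
    StreamName='example-stream',
    StreamModeDetails={'StreamMode': 'ON_DEMAND'},  # no ShardCount needed in on-demand mode
    Tags={'team': 'data-platform'},  # requires kinesis:AddTagsToStream in addition to kinesis:CreateStream
    MaxRecordSizeInKiB=10240
)

# CreateStream is asynchronous: poll until the stream leaves the CREATING state.
while True:
    summary = kinesis.describe_stream_summary(StreamName='example-stream')
    if summary['StreamDescriptionSummary']['StreamStatus'] == 'ACTIVE':
        break
    time.sleep(5)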
DescribeStreamSummary (updated)
Changes (response)
{'StreamDescriptionSummary': {'MaxRecordSizeInKiB': 'integer'}}
Provides a summarized description of the specified Kinesis data stream without the shard list.
The information returned includes the stream name, Amazon Resource Name (ARN), status, record retention period, approximate creation time, monitoring, encryption details, and open shard count.
DescribeStreamSummary has a limit of 20 transactions per second per account.
See also: AWS API Documentation
Request Syntax
client.describe_stream_summary(
    StreamName='string',
    StreamARN='string'
)
StreamName (string)
The name of the stream to describe.
StreamARN (string)
The ARN of the stream.
dict
Response Syntax
{
    'StreamDescriptionSummary': {
        'StreamName': 'string',
        'StreamARN': 'string',
        'StreamStatus': 'CREATING'|'DELETING'|'ACTIVE'|'UPDATING',
        'StreamModeDetails': {
            'StreamMode': 'PROVISIONED'|'ON_DEMAND'
        },
        'RetentionPeriodHours': 123,
        'StreamCreationTimestamp': datetime(2015, 1, 1),
        'EnhancedMonitoring': [
            {
                'ShardLevelMetrics': [
                    'IncomingBytes'|'IncomingRecords'|'OutgoingBytes'|'OutgoingRecords'|'WriteProvisionedThroughputExceeded'|'ReadProvisionedThroughputExceeded'|'IteratorAgeMilliseconds'|'ALL',
                ]
            },
        ],
        'EncryptionType': 'NONE'|'KMS',
        'KeyId': 'string',
        'OpenShardCount': 123,
        'ConsumerCount': 123,
        'MaxRecordSizeInKiB': 123
    }
}
Response Structure
(dict) --
StreamDescriptionSummary (dict) --
A StreamDescriptionSummary containing information about the stream.
StreamName (string) --
The name of the stream being described.
StreamARN (string) --
The Amazon Resource Name (ARN) for the stream being described.
StreamStatus (string) --
The current status of the stream being described. The stream status is one of the following states:
CREATING - The stream is being created. Kinesis Data Streams immediately returns and sets StreamStatus to CREATING.
DELETING - The stream is being deleted. The specified stream is in the DELETING state until Kinesis Data Streams completes the deletion.
ACTIVE - The stream exists and is ready for read and write operations or deletion. You should perform read and write operations only on an ACTIVE stream.
UPDATING - Shards in the stream are being merged or split. Read and write operations continue to work while the stream is in the UPDATING state.
StreamModeDetails (dict) --
Specifies the capacity mode to which you want to set your data stream. Currently, in Kinesis Data Streams, you can choose between an on-demand capacity mode and a provisioned capacity mode for your data streams.
StreamMode (string) --
Specifies the capacity mode to which you want to set your data stream. Currently, in Kinesis Data Streams, you can choose between an on-demand capacity mode and a provisioned capacity mode for your data streams.
RetentionPeriodHours (integer) --
The current retention period, in hours.
StreamCreationTimestamp (datetime) --
The approximate time that the stream was created.
EnhancedMonitoring (list) --
Represents the current enhanced monitoring settings of the stream.
(dict) --
Represents enhanced metrics types.
ShardLevelMetrics (list) --
List of shard-level metrics.
The following are the valid shard-level metrics. The value "ALL" enhances every metric.
IncomingBytes
IncomingRecords
OutgoingBytes
OutgoingRecords
WriteProvisionedThroughputExceeded
ReadProvisionedThroughputExceeded
IteratorAgeMilliseconds
ALL
For more information, see Monitoring the Amazon Kinesis Data Streams Service with Amazon CloudWatch in the Amazon Kinesis Data Streams Developer Guide.
(string) --
EncryptionType (string) --
The encryption type used. This value is one of the following:
KMS
NONE
KeyId (string) --
The GUID for the customer-managed Amazon Web Services KMS key to use for encryption. This value can be a globally unique identifier, a fully specified ARN to either an alias or a key, or an alias name prefixed by "alias/". You can also use a master key owned by Kinesis Data Streams by specifying the alias aws/kinesis.
Key ARN example: arn:aws:kms:us-east-1:123456789012:key/12345678-1234-1234-1234-123456789012
Alias ARN example: arn:aws:kms:us-east-1:123456789012:alias/MyAliasName
Globally unique key ID example: 12345678-1234-1234-1234-123456789012
Alias name example: alias/MyAliasName
Master key owned by Kinesis Data Streams: alias/aws/kinesis
OpenShardCount (integer) --
The number of open shards in the stream.
ConsumerCount (integer) --
The number of enhanced fan-out consumers registered with the stream.
MaxRecordSizeInKiB (integer) --
The maximum record size of a single record, in kibibytes (KiB), that you can write to, and read from, a stream.
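To close the loop, here is a short sketch of reading the new response field; it assumes the placeholder stream from the sketches above already exists.
import boto3

# Placeholder Region and stream name; assumes the example stream above exists.
kinesis = boto3.client('kinesis', region_name='us-east-1')

summary = kinesis.describe_stream_summary(StreamName='example-stream')['StreamDescriptionSummary']
print(summary['StreamName'], summary['StreamStatus'], summary.get('MaxRecordSizeInKiB'))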