boto3 and boto
Getting started
- Create a service account.
- Add the service account to a group to grant it necessary permissions.
- Create a static access key.
Installing
To install the library, follow the instructions in the developer's repository: boto3
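In practice, the repository's instructions boil down to installing from PyPI. A typical invocation (assuming pip is available on your PATH):

```shell
# Install boto3 from PyPI
pip install boto3
# The legacy boto package is separate; install it only if you need
# the boto example below
pip install boto
```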
Setup
Locally
- Go to the ~/.aws/ directory (for macOS and Linux) or C:\Users\<username>\.aws\ (for Windows).
- Create a file named credentials with authentication data for Object Storage and copy the following information to it:

  [default]
  aws_access_key_id = <static_key_ID>
  aws_secret_access_key = <secret_key>

- Create a file named config with the default region parameters and copy the following information to it:

  [default]
  region = eu-north1
Note
Some apps designed to work with Amazon S3 don't let you set the region, so Object Storage also accepts the value us-east-1.
To access Object Storage, use the https://storage.ai.nebius.cloud endpoint.
Example
boto3
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import boto3

session = boto3.session.Session()
s3 = session.client(
    service_name='s3',
    endpoint_url='https://storage.ai.nebius.cloud'
)

# Creating a new bucket
s3.create_bucket(Bucket='bucket-name')

# Uploading objects into the bucket
## From a string
s3.put_object(Bucket='bucket-name', Key='object_name', Body='TEST', StorageClass='STANDARD')
## From a file
s3.upload_file('this_script.py', 'bucket-name', 'py_script.py')
s3.upload_file('this_script.py', 'bucket-name', 'script/py_script.py')

# Getting a list of objects in the bucket
for key in s3.list_objects(Bucket='bucket-name')['Contents']:
    print(key['Key'])

# Deleting multiple objects
for_deletion = [{'Key': 'object_name'}, {'Key': 'script/py_script.py'}]
response = s3.delete_objects(Bucket='bucket-name', Delete={'Objects': for_deletion})

# Retrieving an object
get_object_response = s3.get_object(Bucket='bucket-name', Key='py_script.py')
print(get_object_response['Body'].read())
boto
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import os

from boto.s3.key import Key
from boto.s3.connection import S3Connection

os.environ['S3_USE_SIGV4'] = 'True'
conn = S3Connection(
    host='storage.ai.nebius.cloud'
)
conn.auth_region_name = 'eu-north1'

# Creating a new bucket
conn.create_bucket('bucket-name')
bucket = conn.get_bucket('bucket-name')

# Uploading objects into the bucket
## From a string
bucket.new_key('test-string').set_contents_from_string('TEST')
## From a file
file_key_1 = Key(bucket)
file_key_1.key = 'py_script.py'
file_key_1.set_contents_from_filename('this_script.py')
file_key_2 = Key(bucket)
file_key_2.key = 'script/py_script.py'
file_key_2.set_contents_from_filename('this_script.py')

# Getting a list of objects in the bucket
keys_list = bucket.list()
for key in keys_list:
    print(key.key)

# Deleting multiple objects
response = bucket.delete_keys(['test-string', 'py_script.py'])

# Retrieving an object
key = bucket.get_key('script/py_script.py')
print(key.get_contents_as_string())