
Uploading to S3 from an AWS Lambda function in Python


Code writing

Write the lambda function (test.py)

The function queries the database, generates test.csv locally, and uploads it to the s3://test-bucket-dev bucket under the bthlt prefix.

import os
import logging

import boto3
import pymysql
from botocore.exceptions import ClientError

# Database connection (credentials redacted)
db = pymysql.connect(host='****.****',
                     user='****',
                     password='****',
                     database='****')
cursor = db.cursor()


def cursor_query_all(sql):
    """Execute a query and return all fetched rows."""
    try:
        cursor.execute(sql)
    except Exception as e:
        print("Catch exception: " + str(e))
    return cursor.fetchall()


def get_db_data():
    sql = """select * from test"""
    result = cursor_query_all(sql)
    return result


def upload_file(file_name, bucket, object_name=None):
    """Upload a file to an S3 bucket, defaulting the object key to the file name."""
    if object_name is None:
        object_name = os.path.basename(file_name)
    s3_client = boto3.client('s3')
    try:
        s3_client.upload_file(file_name, bucket, object_name)
    except ClientError as e:
        logging.error(e)
        return False
    return True


def lambda_handler(event, context):
    # /tmp is the only writable path in the Lambda execution environment
    results = get_db_data()
    with open('/tmp/test.csv', 'w') as v_file:
        v_file.write('head1,head2,head3' + '\n')
        for result in results:
            # str() guards against non-string column values
            v_file.write(','.join(str(col) for col in result) + '\n')
    # Upload after the with-block so the file is flushed and closed
    upload_file('/tmp/test.csv', 'test-bucket-dev', 'bthlt/test.csv')
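Note that the manual ','.join above produces malformed rows if any column value itself contains a comma, quote, or newline. A minimal sketch of the same handler rewritten with the standard library's csv module, which handles quoting (and string conversion) automatically:

import csv

def lambda_handler(event, context):
    results = get_db_data()
    # newline='' is the documented open mode for csv file objects
    with open('/tmp/test.csv', 'w', newline='') as v_file:
        writer = csv.writer(v_file)
        writer.writerow(['head1', 'head2', 'head3'])
        for result in results:
            writer.writerow(result)
    upload_file('/tmp/test.csv', 'test-bucket-dev', 'bthlt/test.csv')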

Terraform deployment

Write the dependency requirements.txt file

requirements.txt

boto3==1.20.23
PyMySQL==1.0.2
botocore==1.23.23
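Note: the AWS Lambda Python runtime already ships with boto3 and botocore preinstalled, so pinning them here mainly makes the packaged versions explicit and reproducible; PyMySQL is the only dependency the runtime does not provide.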

Write the packaging script and run it to generate test.zip

package.sh

#!/bin/bash
# Build the Lambda deployment package: function code plus its dependencies.
mkdir deploy
cp test.py deploy
cp requirements.txt deploy
cd deploy
pip install -r requirements.txt -t .
zip -r ../test.zip *
# Leave the build directory before removing it
cd ..
rm -rf deploy
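Run it from the directory that contains test.py and requirements.txt (bash package.sh). The resulting test.zip is the artifact that lambda.tf below references as lambda/test.zip, so place it under the lambda/ directory (or adjust the filename attribute accordingly).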

Write Terraform to deploy the function and trigger it daily through a CloudWatch Events rule.

lambda.tf

resource "aws_iam_role" "test" {
assume_role_policy = jsonencode(
{
Statement = [
{
Action = "sts:AssumeRole"
Effect = "Allow"
Principal = {
Service = "lambda.amazonaws.com"
}
},
]
Version = "2021-10-17"
}
)
force_detach_policies = false
max_session_duration = 3600
name = "test"
path = "/service-role/"
}
resource "aws_lambda_function" "test" {
function_name = "test-upload-s3"
handler = "test.lambda_handler"
role = aws_iam_role.test.arn
runtime = "python3.8"
memory_size = "128"
filename = "lambda/test.zip"
source_code_hash = filebase64sha256("lambda/test.zip")
}
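After terraform apply, a quick way to verify the function end to end is to invoke it manually. A minimal sketch using boto3 (assumes your local credentials are allowed to invoke the deployed test-upload-s3 function):

import boto3

# Invoke the deployed function synchronously and check the status code
client = boto3.client('lambda')
response = client.invoke(FunctionName='test-upload-s3')
print(response['StatusCode'])  # 200 on a successful synchronous invoke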

event.tf

resource "aws_cloudwatch_event_rule" "every_day_upload_file_hours" {
name = "test-file-every-day-${terraform.workspace}"
schedule_expression = "cron(0 1 * * ? *)"
}
resource "aws_cloudwatch_event_target" "event_target_upload_files_s3" {
count = terraform.workspace == "prod" ? 1 : 0
target_id = "every_day_upload_file_hours"
rule = aws_cloudwatch_event_rule.every_day_upload_file_hours.name
arn = "arn:aws-cn:lambda:region:account_id:function:test-upload-s3"
}
resource "aws_lambda_permission" "lambda_permission_upload_files_s3" {
count = terraform.workspace == "prod" ? 1 : 0
action = "lambda:InvokeFunction"
function_name = "test-upload-s3"
principal = "events.amazonaws.com"
source_arn = aws_cloudwatch_event_rule.every_day_upload_file_hours.arn
}
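The schedule expression cron(0 1 * * ? *) fires once a day at 01:00 UTC. The event target and invoke permission are created only in the prod workspace, and the aws-cn ARN partition indicates a China-region deployment; region and account_id in the target ARN are placeholders to fill in for your own account.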
