
AWS S3 CLI access/secret key


#1

Hello,

I’m following this page to test the S3 connection: http://docs.openio.io/user-guide/awscli.html

I’m wondering which field is the access key and which is the secret key.

Source your credentials and get a token:

root@gateway-2:~# openstack ec2 credentials create
+------------+----------------------------------+
| Field      | Value                            |
+------------+----------------------------------+
| access     | a377318b35dd4fb39b291b5b248958c9 |
| project_id | ad61d5fc98f64ea09932ee656bfbd3d9 |
| secret     | b05c7fa607e045c09a6c5c78b9ca6990 |
| trust_id   | None                             |
| user_id    | 745ae085fd5f49e8af0106055ef06d5d |
+------------+----------------------------------+

Replace ACCESS_KEY and SECRET_KEY with the result of the previous command:

root@gateway-2:~/.aws# cat credentials 
[default]
aws_access_key_id=a377318b35dd4fb39b291b5b248958c9
aws_secret_access_key=b05c7fa607e045c09a6c5c78b9ca6990

I’m not sure whether the replacement I made in credentials is correct, as it looks strange that ‘openstack ec2 credentials create’ generates an access key and a secret key of equal length.
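
As a sanity check, aws configure list shows which (partially masked) access and secret key the CLI actually picked up from ~/.aws/credentials, without sending any request:

root@gateway-2:~# aws configure list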

Yongsheng


#2

Hi,

Your configuration file is correct.
You can now use the awscli command.
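
For example, a simple smoke test that lists your buckets (the endpoint below is a placeholder; replace it with your own oioswift proxy URL and port):

aws --endpoint-url http://OIOSWIFT_IP:6007 s3 ls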

Regards,
Sébastien


#3

Hi Sébastien,

But I can’t successfully get the client to access the proxy.

root@gateway-2:~# aws --endpoint-url http://192.168.2.96:6007 --no-verify-ssl s3api create-bucket --bucket firstbucket
An error occurred (InternalError) when calling the CreateBucket operation (reached max retries: 4): unexpected status code 404

#4

Could you run your command with the --debug option, please?


#5

OK, that’s a lot.

root@gateway-2:~# aws --endpoint-url http://192.168.2.96:6007 --no-verify-ssl s3api create-bucket --bucket firstbucket --debug
2017-12-26 17:57:30,205 - MainThread - awscli.clidriver - DEBUG - CLI version: aws-cli/1.11.13 Python/3.5.2 Linux/4.4.0-62-generic botocore/1.4.70
2017-12-26 17:57:30,207 - MainThread - awscli.clidriver - DEBUG - Arguments entered to CLI: ['--endpoint-url', 'http://192.168.2.96:6007', '--no-verify-ssl', 's3api', 'create-bucket', '--bucket', 'firstbucket', '--debug']
2017-12-26 17:57:30,207 - MainThread - botocore.hooks - DEBUG - Event session-initialized: calling handler <function add_scalar_parsers at 0x7fe5e7a16048>
2017-12-26 17:57:30,207 - MainThread - botocore.hooks - DEBUG - Event session-initialized: calling handler <function inject_assume_role_provider_cache at 0x7fe5e68070d0>
2017-12-26 17:57:30,208 - MainThread - botocore.credentials - DEBUG - Skipping environment variable credential check because profile name was explicitly set.
2017-12-26 17:57:30,210 - MainThread - botocore.loaders - DEBUG - Loading JSON file: /usr/lib/python3/dist-packages/botocore/data/s3/2006-03-01/service-2.json
2017-12-26 17:57:30,223 - MainThread - botocore.hooks - DEBUG - Event service-data-loaded.s3: calling handler <function register_retries_for_service at 0x7fe5e6bea9d8>
2017-12-26 17:57:30,223 - MainThread - botocore.handlers - DEBUG - Registering retry handlers for service: s3
2017-12-26 17:57:30,227 - MainThread - botocore.hooks - DEBUG - Event building-command-table.s3api: calling handler <function add_waiters at 0x7fe5e7a1f400>
2017-12-26 17:57:30,230 - MainThread - botocore.loaders - DEBUG - Loading JSON file: /usr/lib/python3/dist-packages/botocore/data/s3/2006-03-01/waiters-2.json
2017-12-26 17:57:30,232 - MainThread - awscli.clidriver - DEBUG - OrderedDict([('acl', <awscli.arguments.CLIArgument object at 0x7fe5e6175d68>), ('bucket', <awscli.arguments.CLIArgument object at 0x7fe5e6175dd8>), ('create-bucket-configuration', <awscli.arguments.CLIArgument object at 0x7fe5e6175e10>), ('grant-full-control', <awscli.arguments.CLIArgument object at 0x7fe5e6175e48>), ('grant-read', <awscli.arguments.CLIArgument object at 0x7fe5e6175eb8>), ('grant-read-acp', <awscli.arguments.CLIArgument object at 0x7fe5e6175e80>), ('grant-write', <awscli.arguments.CLIArgument object at 0x7fe5e6175ef0>), ('grant-write-acp', <awscli.arguments.CLIArgument object at 0x7fe5e6175f28>)])
2017-12-26 17:57:30,233 - MainThread - botocore.hooks - DEBUG - Event building-argument-table.s3api.create-bucket: calling handler <function add_streaming_output_arg at 0x7fe5e7a162f0>
2017-12-26 17:57:30,233 - MainThread - botocore.hooks - DEBUG - Event building-argument-table.s3api.create-bucket: calling handler <function add_cli_input_json at 0x7fe5e6807a60>
2017-12-26 17:57:30,234 - MainThread - botocore.hooks - DEBUG - Event building-argument-table.s3api.create-bucket: calling handler <function unify_paging_params at 0x7fe5e6417730>
2017-12-26 17:57:30,237 - MainThread - botocore.loaders - DEBUG - Loading JSON file: /usr/lib/python3/dist-packages/botocore/data/s3/2006-03-01/paginators-1.json
2017-12-26 17:57:30,237 - MainThread - botocore.hooks - DEBUG - Event building-argument-table.s3api.create-bucket: calling handler <function add_generate_skeleton at 0x7fe5e6481730>
2017-12-26 17:57:30,238 - MainThread - botocore.hooks - DEBUG - Event before-building-argument-table-parser.s3api.create-bucket: calling handler <bound method OverrideRequiredArgsArgument.override_required_args of <awscli.customizations.cliinputjson.CliInputJSONArgument object at 0x7fe5e6175f98>>
2017-12-26 17:57:30,238 - MainThread - botocore.hooks - DEBUG - Event before-building-argument-table-parser.s3api.create-bucket: calling handler <bound method OverrideRequiredArgsArgument.override_required_args of <awscli.customizations.generatecliskeleton.GenerateCliSkeletonArgument object at 0x7fe5e6175fd0>>
2017-12-26 17:57:30,240 - MainThread - botocore.hooks - DEBUG - Event load-cli-arg.s3.create-bucket.acl: calling handler <function uri_param at 0x7fe5e6838b70>
2017-12-26 17:57:30,240 - MainThread - botocore.hooks - DEBUG - Event load-cli-arg.s3.create-bucket.bucket: calling handler <function uri_param at 0x7fe5e6838b70>
2017-12-26 17:57:30,240 - MainThread - botocore.hooks - DEBUG - Event process-cli-arg.s3.create-bucket: calling handler <awscli.argprocess.ParamShorthandParser object at 0x7fe5e67de6a0>
2017-12-26 17:57:30,240 - MainThread - awscli.arguments - DEBUG - Unpacked value of 'firstbucket' for parameter "bucket": 'firstbucket'
2017-12-26 17:57:30,241 - MainThread - botocore.hooks - DEBUG - Event load-cli-arg.s3.create-bucket.create-bucket-configuration: calling handler <function uri_param at 0x7fe5e6838b70>
2017-12-26 17:57:30,241 - MainThread - botocore.hooks - DEBUG - Event load-cli-arg.s3.create-bucket.grant-full-control: calling handler <function uri_param at 0x7fe5e6838b70>
2017-12-26 17:57:30,241 - MainThread - botocore.hooks - DEBUG - Event load-cli-arg.s3.create-bucket.grant-read: calling handler <function uri_param at 0x7fe5e6838b70>
2017-12-26 17:57:30,241 - MainThread - botocore.hooks - DEBUG - Event load-cli-arg.s3.create-bucket.grant-read-acp: calling handler <function uri_param at 0x7fe5e6838b70>
2017-12-26 17:57:30,241 - MainThread - botocore.hooks - DEBUG - Event load-cli-arg.s3.create-bucket.grant-write: calling handler <function uri_param at 0x7fe5e6838b70>
2017-12-26 17:57:30,241 - MainThread - botocore.hooks - DEBUG - Event load-cli-arg.s3.create-bucket.grant-write-acp: calling handler <function uri_param at 0x7fe5e6838b70>
2017-12-26 17:57:30,242 - MainThread - botocore.hooks - DEBUG - Event load-cli-arg.s3.create-bucket.cli-input-json: calling handler <function uri_param at 0x7fe5e6838b70>
2017-12-26 17:57:30,242 - MainThread - botocore.hooks - DEBUG - Event load-cli-arg.s3.create-bucket.generate-cli-skeleton: calling handler <function uri_param at 0x7fe5e6838b70>
2017-12-26 17:57:30,242 - MainThread - botocore.hooks - DEBUG - Event calling-command.s3api.create-bucket: calling handler <bound method GenerateCliSkeletonArgument.generate_json_skeleton of <awscli.customizations.generatecliskeleton.GenerateCliSkeletonArgument object at 0x7fe5e6175fd0>>
2017-12-26 17:57:30,242 - MainThread - botocore.hooks - DEBUG - Event calling-command.s3api.create-bucket: calling handler <bound method CliInputJSONArgument.add_to_call_parameters of <awscli.customizations.cliinputjson.CliInputJSONArgument object at 0x7fe5e6175f98>>
2017-12-26 17:57:30,242 - MainThread - botocore.credentials - DEBUG - Looking for credentials via: env
2017-12-26 17:57:30,242 - MainThread - botocore.credentials - DEBUG - Looking for credentials via: assume-role
2017-12-26 17:57:30,243 - MainThread - botocore.credentials - DEBUG - Looking for credentials via: shared-credentials-file
2017-12-26 17:57:30,244 - MainThread - botocore.credentials - INFO - Found credentials in shared credentials file: ~/.aws/credentials
2017-12-26 17:57:30,244 - MainThread - botocore.loaders - DEBUG - Loading JSON file: /usr/lib/python3/dist-packages/botocore/data/endpoints.json
2017-12-26 17:57:30,249 - MainThread - botocore.client - DEBUG - Registering retry handlers for service: s3
2017-12-26 17:57:30,253 - MainThread - botocore.hooks - DEBUG - Event creating-client-class.s3: calling handler <function add_generate_presigned_post at 0x7fe5e7098bf8>
2017-12-26 17:57:30,254 - MainThread - botocore.hooks - DEBUG - Event creating-client-class.s3: calling handler <function add_generate_presigned_url at 0x7fe5e7098268>
2017-12-26 17:57:30,261 - MainThread - botocore.endpoint - DEBUG - Setting s3 timeout as (60, 60)
2017-12-26 17:57:30,262 - MainThread - botocore.client - DEBUG - Using S3 path style addressing.
2017-12-26 17:57:30,262 - MainThread - botocore.hooks - DEBUG - Event before-parameter-build.s3.CreateBucket: calling handler <function validate_bucket_name at 0x7fe5e6bea730>
2017-12-26 17:57:30,262 - MainThread - botocore.hooks - DEBUG - Event before-parameter-build.s3.CreateBucket: calling handler <bound method S3RegionRedirector.redirect_from_cache of <botocore.utils.S3RegionRedirector object at 0x7fe5e6144898>>
2017-12-26 17:57:30,263 - MainThread - botocore.hooks - DEBUG - Event before-call.s3.CreateBucket: calling handler <function add_expect_header at 0x7fe5e6beabf8>
2017-12-26 17:57:30,263 - MainThread - botocore.hooks - DEBUG - Event before-call.s3.CreateBucket: calling handler <bound method S3RegionRedirector.set_request_url of <botocore.utils.S3RegionRedirector object at 0x7fe5e6144898>>
2017-12-26 17:57:30,263 - MainThread - botocore.endpoint - DEBUG - Making request for OperationModel(name=CreateBucket) (verify_ssl=False) with params: {'query_string': {}, 'url_path': '/firstbucket', 'url': 'http://192.168.2.96:6007/firstbucket', 'context': {'client_config': <botocore.config.Config object at 0x7fe5e61445f8>, 'has_streaming_input': False, 'client_region': None, 'signing': {'bucket': 'firstbucket'}}, 'headers': {'User-Agent': 'aws-cli/1.11.13 Python/3.5.2 Linux/4.4.0-62-generic botocore/1.4.70'}, 'body': b'', 'method': 'PUT'}
2017-12-26 17:57:30,264 - MainThread - botocore.hooks - DEBUG - Event request-created.s3.CreateBucket: calling handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner object at 0x7fe5e61441d0>>
2017-12-26 17:57:30,264 - MainThread - botocore.auth - DEBUG - Calculating signature using hmacv1 auth.
2017-12-26 17:57:30,264 - MainThread - botocore.auth - DEBUG - HTTP request method: PUT
2017-12-26 17:57:30,264 - MainThread - botocore.auth - DEBUG - StringToSign:
PUT


Tue, 26 Dec 2017 09:57:30 GMT
/firstbucket
2017-12-26 17:57:30,269 - MainThread - botocore.endpoint - DEBUG - Sending http request: <PreparedRequest [PUT]>
2017-12-26 17:57:39,199 - MainThread - botocore.parsers - DEBUG - Response headers: {'x-amz-id-2': 'tx2386aa14d1584de7a6948-005a421d0a', 'Date': 'Tue, 26 Dec 2017 09:57:39 GMT', 'Transfer-Encoding': 'chunked', 'x-amz-request-id': 'tx2386aa14d1584de7a6948-005a421d0a', 'Content-Type': 'application/xml', 'X-Trans-Id': 'tx2386aa14d1584de7a6948-005a421d0a'}
2017-12-26 17:57:39,200 - MainThread - botocore.parsers - DEBUG - Response body:
b"<?xml version='1.0' encoding='UTF-8'?>\n<Error><Code>InternalError</Code><Message>unexpected status code 404</Message><RequestId>tx2386aa14d1584de7a6948-005a421d0a</RequestId></Error>"
2017-12-26 17:57:39,200 - MainThread - botocore.hooks - DEBUG - Event needs-retry.s3.CreateBucket: calling handler <botocore.retryhandler.RetryHandler object at 0x7fe5e6169550>
2017-12-26 17:57:39,200 - MainThread - botocore.retryhandler - DEBUG - retry needed: retryable HTTP status code received: 500
2017-12-26 17:57:39,201 - MainThread - botocore.retryhandler - DEBUG - Retry needed, action of: 0.4012834946577435
2017-12-26 17:57:39,201 - MainThread - botocore.hooks - DEBUG - Event needs-retry.s3.CreateBucket: calling handler <bound method S3RegionRedirector.redirect_from_error of <botocore.utils.S3RegionRedirector object at 0x7fe5e6144898>>
2017-12-26 17:57:39,201 - MainThread - botocore.endpoint - DEBUG - Response received to retry, sleeping for 0.4012834946577435 seconds
2017-12-26 17:57:39,603 - MainThread - botocore.hooks - DEBUG - Event request-created.s3.CreateBucket: calling handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner object at 0x7fe5e61441d0>>
2017-12-26 17:57:39,604 - MainThread - botocore.auth - DEBUG - Calculating signature using hmacv1 auth.
2017-12-26 17:57:39,604 - MainThread - botocore.auth - DEBUG - HTTP request method: PUT
2017-12-26 17:57:39,604 - MainThread - botocore.auth - DEBUG - StringToSign:
PUT


Tue, 26 Dec 2017 09:57:39 GMT
/firstbucket
2017-12-26 17:57:39,605 - MainThread - botocore.endpoint - DEBUG - Sending http request: <PreparedRequest [PUT]>
2017-12-26 17:57:48,460 - MainThread - botocore.parsers - DEBUG - Response headers: {'x-amz-id-2': 'tx0272999d697347b596b6e-005a421d13', 'Date': 'Tue, 26 Dec 2017 09:57:48 GMT', 'Transfer-Encoding': 'chunked', 'x-amz-request-id': 'tx0272999d697347b596b6e-005a421d13', 'Content-Type': 'application/xml', 'X-Trans-Id': 'tx0272999d697347b596b6e-005a421d13'}
2017-12-26 17:57:48,460 - MainThread - botocore.parsers - DEBUG - Response body:
b"<?xml version='1.0' encoding='UTF-8'?>\n<Error><Code>InternalError</Code><Message>unexpected status code 404</Message><RequestId>tx0272999d697347b596b6e-005a421d13</RequestId></Error>"
2017-12-26 17:57:48,461 - MainThread - botocore.hooks - DEBUG - Event needs-retry.s3.CreateBucket: calling handler <botocore.retryhandler.RetryHandler object at 0x7fe5e6169550>
2017-12-26 17:57:48,461 - MainThread - botocore.retryhandler - DEBUG - retry needed: retryable HTTP status code received: 500
2017-12-26 17:57:48,461 - MainThread - botocore.retryhandler - DEBUG - Retry needed, action of: 0.9944973329963009
2017-12-26 17:57:48,461 - MainThread - botocore.hooks - DEBUG - Event needs-retry.s3.CreateBucket: calling handler <bound method S3RegionRedirector.redirect_from_error of <botocore.utils.S3RegionRedirector object at 0x7fe5e6144898>>
2017-12-26 17:57:48,461 - MainThread - botocore.endpoint - DEBUG - Response received to retry, sleeping for 0.9944973329963009 seconds
2017-12-26 17:57:49,457 - MainThread - botocore.hooks - DEBUG - Event request-created.s3.CreateBucket: calling handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner object at 0x7fe5e61441d0>>
2017-12-26 17:57:49,458 - MainThread - botocore.auth - DEBUG - Calculating signature using hmacv1 auth.
2017-12-26 17:57:49,458 - MainThread - botocore.auth - DEBUG - HTTP request method: PUT
2017-12-26 17:57:49,458 - MainThread - botocore.auth - DEBUG - StringToSign:
PUT


Tue, 26 Dec 2017 09:57:49 GMT
/firstbucket
2017-12-26 17:57:49,459 - MainThread - botocore.endpoint - DEBUG - Sending http request: <PreparedRequest [PUT]>
2017-12-26 17:57:58,354 - MainThread - botocore.parsers - DEBUG - Response headers: {'x-amz-id-2': 'tx1446caa189304d7ca7b1b-005a421d1d', 'Date': 'Tue, 26 Dec 2017 09:57:58 GMT', 'Transfer-Encoding': 'chunked', 'x-amz-request-id': 'tx1446caa189304d7ca7b1b-005a421d1d', 'Content-Type': 'application/xml', 'X-Trans-Id': 'tx1446caa189304d7ca7b1b-005a421d1d'}
2017-12-26 17:57:58,354 - MainThread - botocore.parsers - DEBUG - Response body:
b"<?xml version='1.0' encoding='UTF-8'?>\n<Error><Code>InternalError</Code><Message>unexpected status code 404</Message><RequestId>tx1446caa189304d7ca7b1b-005a421d1d</RequestId></Error>"
2017-12-26 17:57:58,354 - MainThread - botocore.hooks - DEBUG - Event needs-retry.s3.CreateBucket: calling handler <botocore.retryhandler.RetryHandler object at 0x7fe5e6169550>
2017-12-26 17:57:58,354 - MainThread - botocore.retryhandler - DEBUG - retry needed: retryable HTTP status code received: 500
2017-12-26 17:57:58,355 - MainThread - botocore.retryhandler - DEBUG - Retry needed, action of: 3.766867640951731
2017-12-26 17:57:58,355 - MainThread - botocore.hooks - DEBUG - Event needs-retry.s3.CreateBucket: calling handler <bound method S3RegionRedirector.redirect_from_error of <botocore.utils.S3RegionRedirector object at 0x7fe5e6144898>>
2017-12-26 17:57:58,355 - MainThread - botocore.endpoint - DEBUG - Response received to retry, sleeping for 3.766867640951731 seconds
2017-12-26 17:58:02,126 - MainThread - botocore.hooks - DEBUG - Event request-created.s3.CreateBucket: calling handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner object at 0x7fe5e61441d0>>
2017-12-26 17:58:02,127 - MainThread - botocore.auth - DEBUG - Calculating signature using hmacv1 auth.
2017-12-26 17:58:02,127 - MainThread - botocore.auth - DEBUG - HTTP request method: PUT
2017-12-26 17:58:02,127 - MainThread - botocore.auth - DEBUG - StringToSign:
PUT


Tue, 26 Dec 2017 09:58:02 GMT
/firstbucket
2017-12-26 17:58:02,128 - MainThread - botocore.endpoint - DEBUG - Sending http request: <PreparedRequest [PUT]>
2017-12-26 17:58:11,008 - MainThread - botocore.parsers - DEBUG - Response headers: {'x-amz-id-2': 'txe0ba52739ff74a01a16f0-005a421d2a', 'Date': 'Tue, 26 Dec 2017 09:58:11 GMT', 'Transfer-Encoding': 'chunked', 'x-amz-request-id': 'txe0ba52739ff74a01a16f0-005a421d2a', 'Content-Type': 'application/xml', 'X-Trans-Id': 'txe0ba52739ff74a01a16f0-005a421d2a'}
2017-12-26 17:58:11,009 - MainThread - botocore.parsers - DEBUG - Response body:
b"<?xml version='1.0' encoding='UTF-8'?>\n<Error><Code>InternalError</Code><Message>unexpected status code 404</Message><RequestId>txe0ba52739ff74a01a16f0-005a421d2a</RequestId></Error>"
2017-12-26 17:58:11,009 - MainThread - botocore.hooks - DEBUG - Event needs-retry.s3.CreateBucket: calling handler <botocore.retryhandler.RetryHandler object at 0x7fe5e6169550>
2017-12-26 17:58:11,009 - MainThread - botocore.retryhandler - DEBUG - retry needed: retryable HTTP status code received: 500
2017-12-26 17:58:11,009 - MainThread - botocore.retryhandler - DEBUG - Retry needed, action of: 3.5062570353272955
2017-12-26 17:58:11,010 - MainThread - botocore.hooks - DEBUG - Event needs-retry.s3.CreateBucket: calling handler <bound method S3RegionRedirector.redirect_from_error of <botocore.utils.S3RegionRedirector object at 0x7fe5e6144898>>
2017-12-26 17:58:11,010 - MainThread - botocore.endpoint - DEBUG - Response received to retry, sleeping for 3.5062570353272955 seconds
2017-12-26 17:58:14,517 - MainThread - botocore.hooks - DEBUG - Event request-created.s3.CreateBucket: calling handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner object at 0x7fe5e61441d0>>
2017-12-26 17:58:14,518 - MainThread - botocore.auth - DEBUG - Calculating signature using hmacv1 auth.
2017-12-26 17:58:14,518 - MainThread - botocore.auth - DEBUG - HTTP request method: PUT
2017-12-26 17:58:14,519 - MainThread - botocore.auth - DEBUG - StringToSign:
PUT


Tue, 26 Dec 2017 09:58:14 GMT
/firstbucket
2017-12-26 17:58:14,520 - MainThread - botocore.endpoint - DEBUG - Sending http request: <PreparedRequest [PUT]>
2017-12-26 17:58:23,405 - MainThread - botocore.parsers - DEBUG - Response headers: {'x-amz-id-2': 'tx43a45e6935844073afefe-005a421d36', 'Date': 'Tue, 26 Dec 2017 09:58:23 GMT', 'Transfer-Encoding': 'chunked', 'x-amz-request-id': 'tx43a45e6935844073afefe-005a421d36', 'Content-Type': 'application/xml', 'X-Trans-Id': 'tx43a45e6935844073afefe-005a421d36'}
2017-12-26 17:58:23,406 - MainThread - botocore.parsers - DEBUG - Response body:
b"<?xml version='1.0' encoding='UTF-8'?>\n<Error><Code>InternalError</Code><Message>unexpected status code 404</Message><RequestId>tx43a45e6935844073afefe-005a421d36</RequestId></Error>"
2017-12-26 17:58:23,406 - MainThread - botocore.hooks - DEBUG - Event needs-retry.s3.CreateBucket: calling handler <botocore.retryhandler.RetryHandler object at 0x7fe5e6169550>
2017-12-26 17:58:23,406 - MainThread - botocore.retryhandler - DEBUG - retry needed: retryable HTTP status code received: 500
2017-12-26 17:58:23,406 - MainThread - botocore.retryhandler - DEBUG - Reached the maximum number of retry attempts: 5
2017-12-26 17:58:23,407 - MainThread - botocore.retryhandler - DEBUG - No retry needed.
2017-12-26 17:58:23,407 - MainThread - botocore.hooks - DEBUG - Event needs-retry.s3.CreateBucket: calling handler <bound method S3RegionRedirector.redirect_from_error of <botocore.utils.S3RegionRedirector object at 0x7fe5e6144898>>
2017-12-26 17:58:23,407 - MainThread - botocore.hooks - DEBUG - Event after-call.s3.CreateBucket: calling handler <function enhance_error_msg at 0x7fe5e79ffc80>
2017-12-26 17:58:23,407 - MainThread - awscli.clidriver - DEBUG - Exception caught in main()
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/awscli/clidriver.py", line 186, in main
    return command_table[parsed_args.command](remaining, parsed_args)
  File "/usr/lib/python3/dist-packages/awscli/clidriver.py", line 381, in __call__
    return command_table[parsed_args.operation](remaining, parsed_globals)
  File "/usr/lib/python3/dist-packages/awscli/clidriver.py", line 551, in __call__
    call_parameters, parsed_globals)
  File "/usr/lib/python3/dist-packages/awscli/clidriver.py", line 675, in invoke
    **parameters)
  File "/usr/lib/python3/dist-packages/botocore/client.py", line 251, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/usr/lib/python3/dist-packages/botocore/client.py", line 537, in _make_api_call
    raise ClientError(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (InternalError) when calling the CreateBucket operation (reached max retries: 4): unexpected status code 404
2017-12-26 17:58:23,409 - MainThread - awscli.clidriver - DEBUG - Exiting with rc 255

An error occurred (InternalError) when calling the CreateBucket operation (reached max retries: 4): unexpected status code 404

#6

From /var/log/syslog:

Dec 26 17:58:52 gateway-2 puppet-agent[13601]: Could not request certificate: Failed to open TCP connection to puppet:8140 (getaddrinfo: Name or service not known)


#7

Can you grep one of the transaction IDs in the oioswift log, please?
grep tx2386aa14d1584de7a6948-005a421d0a /var/log/oio/sds/OPENIO/oioswift-0/oioswift-0.log (if your namespace name is OPENIO).


#8

Yes.

root@gateway-2:~# grep tx2386aa14d1584de7a6948-005a421d0a /var/log/oio/sds/OPENIO/oioswift-0/oioswift-0.log
2017-12-26T17:57:32.452303+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  ERROR Unhandled exception in request: #012Traceback (most recent call last):#012  File "/usr/lib/python2.7/dist-packages/swift/proxy/server.py", line 411, in handle_request#012    return handler(req)#012  File "/usr/lib/python2.7/dist-packages/oioswift/proxy/controllers/account.py", line 167, in HEAD#012    resp = self.get_account_head_resp(req)#012  File "/usr/lib/python2.7/dist-packages/oioswift/proxy/controllers/account.py", line 180, in get_account_head_resp#012    info = self.app.storage.account_show(self.account_name)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 59, in _wrapped#012    return fnc(self, account, *args, **kwargs)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 213, in account_show#012    headers=headers)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 669, in _account_request#012    all_urls = self._get_service_url('account')#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 658, in _get_service_url#012    resp, resp_body = self._request('GET', uri, params=params)#012  File "/usr/lib/python2.7/dist-packages/oio/api/base.py", line 55, in _request#012    **kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 468, in request#012    resp = self.send(prep, **send_kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 576, in send#012    r = adapter.send(request, **kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 432, in send#012    raise ConnectTimeout(e, request=request)#012ConnectTimeout: HTTPConnectionPool(host='192.168.4.96', port=6006): Max retries exceeded with url: /v3.0/OPENIO/lb/choose?type=account (Caused by ConnectTimeoutError(<requests.packages.urllib3.connection.HTTPConnection object at 0x7fdf42ffa790>, 'Connection to 192.168.4.96 timed out. (connect timeout=2.0)')) (txn: tx2386aa14d1584de7a6948-005a421d0a)
2017-12-26T17:57:32.453899+08:00 gateway-2 OIO,OPENIO,oioswift,0: info  - - 26/Dec/2017/09/57/32 HEAD /v1/AUTH_ad61d5fc98f64ea09932ee656bfbd3d9 HTTP/1.0 500 - Swift - - - - tx2386aa14d1584de7a6948-005a421d0a - 2.0032 RL - 1514282250.449287891 1514282252.452496052 -
2017-12-26T17:57:34.684201+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  ERROR Unhandled exception in request: #012Traceback (most recent call last):#012  File "/usr/lib/python2.7/dist-packages/swift/proxy/server.py", line 411, in handle_request#012    return handler(req)#012  File "/usr/lib/python2.7/dist-packages/oioswift/proxy/controllers/account.py", line 167, in HEAD#012    resp = self.get_account_head_resp(req)#012  File "/usr/lib/python2.7/dist-packages/oioswift/proxy/controllers/account.py", line 180, in get_account_head_resp#012    info = self.app.storage.account_show(self.account_name)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 59, in _wrapped#012    return fnc(self, account, *args, **kwargs)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 213, in account_show#012    headers=headers)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 669, in _account_request#012    all_urls = self._get_service_url('account')#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 658, in _get_service_url#012    resp, resp_body = self._request('GET', uri, params=params)#012  File "/usr/lib/python2.7/dist-packages/oio/api/base.py", line 55, in _request#012    **kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 468, in request#012    resp = self.send(prep, **send_kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 576, in send#012    r = adapter.send(request, **kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 432, in send#012    raise ConnectTimeout(e, request=request)#012ConnectTimeout: HTTPConnectionPool(host='192.168.4.96', port=6006): Max retries exceeded with url: /v3.0/OPENIO/lb/choose?type=account (Caused by ConnectTimeoutError(<requests.packages.urllib3.connection.HTTPConnection object at 0x7fdf42ff1110>, 'Connection to 192.168.4.96 timed out. (connect timeout=2.0)')) (txn: tx2386aa14d1584de7a6948-005a421d0a)
2017-12-26T17:57:34.685481+08:00 gateway-2 OIO,OPENIO,oioswift,0: info  - - 26/Dec/2017/09/57/34 HEAD /v1/AUTH_ad61d5fc98f64ea09932ee656bfbd3d9 HTTP/1.0 500 - Swift - - - - tx2386aa14d1584de7a6948-005a421d0a - 2.0030 KS - 1514282252.681417942 1514282254.684391022 -
2017-12-26T17:57:36.688876+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  ERROR Unhandled exception in request: #012Traceback (most recent call last):#012  File "/usr/lib/python2.7/dist-packages/swift/proxy/server.py", line 411, in handle_request#012    return handler(req)#012  File "/usr/lib/python2.7/dist-packages/oioswift/proxy/controllers/account.py", line 167, in HEAD#012    resp = self.get_account_head_resp(req)#012  File "/usr/lib/python2.7/dist-packages/oioswift/proxy/controllers/account.py", line 180, in get_account_head_resp#012    info = self.app.storage.account_show(self.account_name)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 59, in _wrapped#012    return fnc(self, account, *args, **kwargs)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 213, in account_show#012    headers=headers)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 669, in _account_request#012    all_urls = self._get_service_url('account')#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 658, in _get_service_url#012    resp, resp_body = self._request('GET', uri, params=params)#012  File "/usr/lib/python2.7/dist-packages/oio/api/base.py", line 55, in _request#012    **kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 468, in request#012    resp = self.send(prep, **send_kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 576, in send#012    r = adapter.send(request, **kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 432, in send#012    raise ConnectTimeout(e, request=request)#012ConnectTimeout: HTTPConnectionPool(host='192.168.4.96', port=6006): Max retries exceeded with url: /v3.0/OPENIO/lb/choose?type=account (Caused by ConnectTimeoutError(<requests.packages.urllib3.connection.HTTPConnection object at 0x7fdf42ffa710>, 'Connection to 192.168.4.96 timed out. (connect timeout=2.0)')) (txn: tx2386aa14d1584de7a6948-005a421d0a)
2017-12-26T17:57:37.190722+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  ERROR with Container server 10.0.0.1:1001/sdb re: Trying to PUT /AUTH_ad61d5fc98f64ea09932ee656bfbd3d9: ConnectionTimeout (0.5s) (txn: tx2386aa14d1584de7a6948-005a421d0a)
2017-12-26T17:57:37.191171+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  ERROR with Container server 10.0.0.0:1000/sda re: Trying to PUT /AUTH_ad61d5fc98f64ea09932ee656bfbd3d9: ConnectionTimeout (0.5s) (txn: tx2386aa14d1584de7a6948-005a421d0a)
2017-12-26T17:57:37.191673+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  ERROR with Container server 10.0.0.2:1002/sdc re: Trying to PUT /AUTH_ad61d5fc98f64ea09932ee656bfbd3d9: ConnectionTimeout (0.5s) (txn: tx2386aa14d1584de7a6948-005a421d0a)
2017-12-26T17:57:37.192284+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  Container GET returning 503 for (503, 503, 503) (txn: tx2386aa14d1584de7a6948-005a421d0a)
2017-12-26T17:57:37.192600+08:00 gateway-2 OIO,OPENIO,oioswift,0: warning  Could not autocreate account '/AUTH_ad61d5fc98f64ea09932ee656bfbd3d9' (txn: tx2386aa14d1584de7a6948-005a421d0a)
2017-12-26T17:57:39.196433+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  ERROR Unhandled exception in request: #012Traceback (most recent call last):#012  File "/usr/lib/python2.7/dist-packages/swift/proxy/server.py", line 411, in handle_request#012    return handler(req)#012  File "/usr/lib/python2.7/dist-packages/oioswift/proxy/controllers/account.py", line 167, in HEAD#012    resp = self.get_account_head_resp(req)#012  File "/usr/lib/python2.7/dist-packages/oioswift/proxy/controllers/account.py", line 180, in get_account_head_resp#012    info = self.app.storage.account_show(self.account_name)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 59, in _wrapped#012    return fnc(self, account, *args, **kwargs)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 213, in account_show#012    headers=headers)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 669, in _account_request#012    all_urls = self._get_service_url('account')#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 658, in _get_service_url#012    resp, resp_body = self._request('GET', uri, params=params)#012  File "/usr/lib/python2.7/dist-packages/oio/api/base.py", line 55, in _request#012    **kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 468, in request#012    resp = self.send(prep, **send_kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 576, in send#012    r = adapter.send(request, **kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 432, in send#012    raise ConnectTimeout(e, request=request)#012ConnectTimeout: HTTPConnectionPool(host='192.168.4.96', port=6006): Max retries exceeded with url: /v3.0/OPENIO/lb/choose?type=account (Caused by ConnectTimeoutError(<requests.packages.urllib3.connection.HTTPConnection object at 0x7fdf42ffa910>, 'Connection to 192.168.4.96 timed out. (connect timeout=2.0)')) (txn: tx2386aa14d1584de7a6948-005a421d0a)
2017-12-26T17:57:39.197860+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  500 Internal Server Error: #012Traceback (most recent call last):#012  File "/usr/lib/python2.7/dist-packages/swift3/middleware.py", line 81, in __call__#012    resp = self.handle_request(req)#012  File "/usr/lib/python2.7/dist-packages/swift3/middleware.py", line 104, in handle_request#012    res = getattr(controller, req.method)(req)#012  File "/usr/lib/python2.7/dist-packages/swift3/controllers/bucket.py", line 146, in PUT#012    resp = req.get_response(self.app)#012  File "/usr/lib/python2.7/dist-packages/swift3/request.py", line 686, in get_response#012    headers, body, query)#012  File "/usr/lib/python2.7/dist-packages/swift3/request.py", line 671, in _get_response#012    raise InternalError('unexpected status code %d' % status)#012InternalError: 500 Internal Server Error (txn: tx2386aa14d1584de7a6948-005a421d0a)
2017-12-26T17:57:39.198685+08:00 gateway-2 OIO,OPENIO,oioswift,0: info  192.168.2.96 192.168.2.96 26/Dec/2017/09/57/39 PUT /firstbucket HTTP/1.0 500 - aws-cli/1.11.13%20Python/3.5.2%20Linux/4.4.0-62-generic%20botocore/1.4.70 - - 182 - tx2386aa14d1584de7a6948-005a421d0a - 8.9250 - - 1514282250.273241043 1514282259.198261023 -

#9

I haven’t set up any rawx yet. I’m also wondering about this part of the installation page:

Puppet Manifest

Here is an example manifest you can tune to your own settings:

OPENIO_PROXY_URL should point to an oioproxy service. 6006 is the default port, so you can just change the OIO_SERVER to another server where OpenIO is installed.

In my environment, the Gateway is installed on a standalone node which doesn’t have any connection with the OpenIO nodes. So what does OPENIO_PROXY_URL mean? Does it refer to the node on which the Gateway is installed? Should it be an OpenIO node? I didn’t use an OpenIO node to replace OPENIO_PROXY_URL; I used the Gateway node.


#11

OK, you must have a running namespace behind your oioswift service.
If you install oioswift on a separate node, OPENIO_PROXY_URL must be the IP of one of your OpenIO nodes.
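
You can also check that the oioswift node really reaches the oioproxy with something like this (the URL path is taken from the traceback in your log, 6006 is the default oioproxy port, and OPENIO_NODE_IP is a placeholder for one of your OpenIO nodes):

curl -v "http://OPENIO_NODE_IP:6006/v3.0/OPENIO/lb/choose?type=account"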


#12

OK, that might be the cause. I will replace it and re-apply.


#13

Hi Sébastien,

I added a disk, but it doesn’t seem to have worked. This is the part I added to openio.pp.

openiosds::rawx {'rawx-20':
  ns           => 'OPENIO',
  ipaddress    => $ipaddress,
  num          => '20',
  port         => '6050',
  documentRoot => '/mnt/sdb',
}
openiosds::rdir {'rdir-20':
  ns        => 'OPENIO',
  ipaddress => $ipaddress,
  num       => '20',
  port      => '6650',
}
openiosds::oioblobindexer { 'oio-blob-indexer-rawx-20':
  num    => '20',
  ns     => 'OPENIO',
  volume => '/mnt/sdb',
}

After applying openio.pp, I listed the rawx services to check whether it had been added:

[root@io-x02 ~]# openio --oio-ns=OPENIO cluster list rawx
+------+-------------------+--------------------------------+----------+-------+------+-------+
| Type | Id                | Volume                         | Location | Slots | Up   | Score |
+------+-------------------+--------------------------------+----------+-------+------+-------+
| rawx | 192.168.2.91:6004 | /var/lib/oio/sds/OPENIO/rawx-0 | io-x01   | n/a   | True |    93 |
| rawx | 192.168.2.93:6004 | /var/lib/oio/sds/OPENIO/rawx-0 | io-x03   | n/a   | True |    93 |
| rawx | 192.168.2.92:6004 | /var/lib/oio/sds/OPENIO/rawx-0 | io-x02   | n/a   | True |    94 |
+------+-------------------+--------------------------------+----------+-------+------+-------+

I expected a rawx-20 at 192.168.2.92:6050 to show up in the list, but I can’t see it. I did run mkfs.xfs on the disk and mounted it.
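
Roughly what I ran to prepare the disk before applying the manifest (the device name here is just an example):

mkfs.xfs /dev/sdb
mkdir -p /mnt/sdb
mount /dev/sdb /mnt/sdb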

Best regards,
Yongsheng


#14

Hi,
run these commands after the puppet apply:

# gridinit_cmd restart @conscienceagent
# openio --oio-ns=OPENIO cluster unlockall
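
Then list the rawx services again to check that the new ones show up with a score:

# openio --oio-ns=OPENIO cluster list rawx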

#15

Great. It now looks like they have been added after the two steps you suggested.

[root@io-x03 ~]# openio --oio-ns=OPENIO cluster list rawx
+------+-------------------+--------------------------------+----------+-------+------+-------+
| Type | Id                | Volume                         | Location | Slots | Up   | Score |
+------+-------------------+--------------------------------+----------+-------+------+-------+
| rawx | 192.168.2.92:6050 | /mnt/sdb                       | io-x02   | n/a   | True |   100 |
| rawx | 192.168.2.91:6050 | /mnt/sdb                       | io-x01   | n/a   | True |   100 |
| rawx | 192.168.2.93:6050 | /mnt/sdb                       | io-x03   | n/a   | True |    99 |
| rawx | 192.168.2.91:6004 | /var/lib/oio/sds/OPENIO/rawx-0 | io-x01   | n/a   | True |    93 |
| rawx | 192.168.2.93:6004 | /var/lib/oio/sds/OPENIO/rawx-0 | io-x03   | n/a   | True |    94 |
| rawx | 192.168.2.92:6004 | /var/lib/oio/sds/OPENIO/rawx-0 | io-x02   | n/a   | True |    94 |
+------+-------------------+--------------------------------+----------+-------+------+-------+

However, I still can’t create a bucket from the gateway node. It reports the same 404 as the --debug output in my previous post.


#16

My cluster is composed of:

3 OpenIO nodes (CentOS 7)
- 192.168.2.91
- 192.168.2.92
- 192.168.2.93
1 Gateway node (Ubuntu 16.04)
- 192.168.2.96

These IPs exactly match the ones in my previous posts.

In openio.pp of 192.168.2.96, I replaced OPENIO_PROXY_URL with 192.168.2.91.

In the aws command, I used 192.168.2.96 for --endpoint-url because the swift proxy and keystone are installed on 192.168.2.96.


#17

Just wondering why there are three connection attempts to container servers. I don’t have any IPs like 10.0.0.x.

root@gateway-2:/var/log/oio/sds/OPENIO/oioswift-0# grep txb325dbb2c9ac4c3392a1e-005a431aee oioswift-0.log 
2017-12-27T12:00:48.786833+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  ERROR Unhandled exception in request: #012Traceback (most recent call last):#012  File "/usr/lib/python2.7/dist-packages/swift/proxy/server.py", line 411, in handle_request#012    return handler(req)#012  File "/usr/lib/python2.7/dist-packages/oioswift/proxy/controllers/account.py", line 167, in HEAD#012    resp = self.get_account_head_resp(req)#012  File "/usr/lib/python2.7/dist-packages/oioswift/proxy/controllers/account.py", line 180, in get_account_head_resp#012    info = self.app.storage.account_show(self.account_name)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 59, in _wrapped#012    return fnc(self, account, *args, **kwargs)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 213, in account_show#012    headers=headers)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 669, in _account_request#012    all_urls = self._get_service_url('account')#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 658, in _get_service_url#012    resp, resp_body = self._request('GET', uri, params=params)#012  File "/usr/lib/python2.7/dist-packages/oio/api/base.py", line 55, in _request#012    **kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 468, in request#012    resp = self.send(prep, **send_kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 576, in send#012    r = adapter.send(request, **kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 432, in send#012    raise ConnectTimeout(e, request=request)#012ConnectTimeout: HTTPConnectionPool(host='192.168.4.96', port=6006): Max retries exceeded with url: /v3.0/OPENIO/lb/choose?type=account (Caused by ConnectTimeoutError(<requests.packages.urllib3.connection.HTTPConnection object at 0x7fdf42fd2910>, 'Connection to 192.168.4.96 timed out. (connect timeout=2.0)')) (txn: txb325dbb2c9ac4c3392a1e-005a431aee)
2017-12-27T12:00:48.788038+08:00 gateway-2 OIO,OPENIO,oioswift,0: info  - - 27/Dec/2017/04/00/48 HEAD /v1/AUTH_ad61d5fc98f64ea09932ee656bfbd3d9 HTTP/1.0 500 - Swift - - - - txb325dbb2c9ac4c3392a1e-005a431aee - 2.0033 RL - 1514347246.783698082 1514347248.787009001 -
2017-12-27T12:00:50.896207+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  ERROR Unhandled exception in request: #012Traceback (most recent call last):#012  File "/usr/lib/python2.7/dist-packages/swift/proxy/server.py", line 411, in handle_request#012    return handler(req)#012  File "/usr/lib/python2.7/dist-packages/oioswift/proxy/controllers/account.py", line 167, in HEAD#012    resp = self.get_account_head_resp(req)#012  File "/usr/lib/python2.7/dist-packages/oioswift/proxy/controllers/account.py", line 180, in get_account_head_resp#012    info = self.app.storage.account_show(self.account_name)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 59, in _wrapped#012    return fnc(self, account, *args, **kwargs)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 213, in account_show#012    headers=headers)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 669, in _account_request#012    all_urls = self._get_service_url('account')#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 658, in _get_service_url#012    resp, resp_body = self._request('GET', uri, params=params)#012  File "/usr/lib/python2.7/dist-packages/oio/api/base.py", line 55, in _request#012    **kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 468, in request#012    resp = self.send(prep, **send_kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 576, in send#012    r = adapter.send(request, **kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 432, in send#012    raise ConnectTimeout(e, request=request)#012ConnectTimeout: HTTPConnectionPool(host='192.168.4.96', port=6006): Max retries exceeded with url: /v3.0/OPENIO/lb/choose?type=account (Caused by ConnectTimeoutError(<requests.packages.urllib3.connection.HTTPConnection object at 0x7fdf42fd7390>, 'Connection to 192.168.4.96 timed out. (connect timeout=2.0)')) (txn: txb325dbb2c9ac4c3392a1e-005a431aee)
2017-12-27T12:00:50.897347+08:00 gateway-2 OIO,OPENIO,oioswift,0: info  - - 27/Dec/2017/04/00/50 HEAD /v1/AUTH_ad61d5fc98f64ea09932ee656bfbd3d9 HTTP/1.0 500 - Swift - - - - txb325dbb2c9ac4c3392a1e-005a431aee - 2.0029 KS - 1514347248.893450022 1514347250.896384954 -
2017-12-27T12:00:52.901026+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  ERROR Unhandled exception in request: #012Traceback (most recent call last):#012  File "/usr/lib/python2.7/dist-packages/swift/proxy/server.py", line 411, in handle_request#012    return handler(req)#012  File "/usr/lib/python2.7/dist-packages/oioswift/proxy/controllers/account.py", line 167, in HEAD#012    resp = self.get_account_head_resp(req)#012  File "/usr/lib/python2.7/dist-packages/oioswift/proxy/controllers/account.py", line 180, in get_account_head_resp#012    info = self.app.storage.account_show(self.account_name)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 59, in _wrapped#012    return fnc(self, account, *args, **kwargs)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 213, in account_show#012    headers=headers)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 669, in _account_request#012    all_urls = self._get_service_url('account')#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 658, in _get_service_url#012    resp, resp_body = self._request('GET', uri, params=params)#012  File "/usr/lib/python2.7/dist-packages/oio/api/base.py", line 55, in _request#012    **kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 468, in request#012    resp = self.send(prep, **send_kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 576, in send#012    r = adapter.send(request, **kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 432, in send#012    raise ConnectTimeout(e, request=request)#012ConnectTimeout: HTTPConnectionPool(host='192.168.4.96', port=6006): Max retries exceeded with url: /v3.0/OPENIO/lb/choose?type=account (Caused by ConnectTimeoutError(<requests.packages.urllib3.connection.HTTPConnection object at 0x7fdf42fd7290>, 'Connection to 192.168.4.96 timed out. (connect timeout=2.0)')) (txn: txb325dbb2c9ac4c3392a1e-005a431aee)
2017-12-27T12:00:53.403053+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  ERROR with Container server 10.0.0.2:1002/sdc re: Trying to PUT /AUTH_ad61d5fc98f64ea09932ee656bfbd3d9: ConnectionTimeout (0.5s) (txn: txb325dbb2c9ac4c3392a1e-005a431aee)
2017-12-27T12:00:53.403569+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  ERROR with Container server 10.0.0.1:1001/sdb re: Trying to PUT /AUTH_ad61d5fc98f64ea09932ee656bfbd3d9: ConnectionTimeout (0.5s) (txn: txb325dbb2c9ac4c3392a1e-005a431aee)
2017-12-27T12:00:53.404201+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  ERROR with Container server 10.0.0.0:1000/sda re: Trying to PUT /AUTH_ad61d5fc98f64ea09932ee656bfbd3d9: ConnectionTimeout (0.5s) (txn: txb325dbb2c9ac4c3392a1e-005a431aee)
2017-12-27T12:00:53.404907+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  Container GET returning 503 for (503, 503, 503) (txn: txb325dbb2c9ac4c3392a1e-005a431aee)
2017-12-27T12:00:53.405165+08:00 gateway-2 OIO,OPENIO,oioswift,0: warning  Could not autocreate account '/AUTH_ad61d5fc98f64ea09932ee656bfbd3d9' (txn: txb325dbb2c9ac4c3392a1e-005a431aee)
2017-12-27T12:00:55.409051+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  ERROR Unhandled exception in request: #012Traceback (most recent call last):#012  File "/usr/lib/python2.7/dist-packages/swift/proxy/server.py", line 411, in handle_request#012    return handler(req)#012  File "/usr/lib/python2.7/dist-packages/oioswift/proxy/controllers/account.py", line 167, in HEAD#012    resp = self.get_account_head_resp(req)#012  File "/usr/lib/python2.7/dist-packages/oioswift/proxy/controllers/account.py", line 180, in get_account_head_resp#012    info = self.app.storage.account_show(self.account_name)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 59, in _wrapped#012    return fnc(self, account, *args, **kwargs)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 213, in account_show#012    headers=headers)#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 669, in _account_request#012    all_urls = self._get_service_url('account')#012  File "/usr/lib/python2.7/dist-packages/oio/api/object_storage.py", line 658, in _get_service_url#012    resp, resp_body = self._request('GET', uri, params=params)#012  File "/usr/lib/python2.7/dist-packages/oio/api/base.py", line 55, in _request#012    **kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 468, in request#012    resp = self.send(prep, **send_kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 576, in send#012    r = adapter.send(request, **kwargs)#012  File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 432, in send#012    raise ConnectTimeout(e, request=request)#012ConnectTimeout: HTTPConnectionPool(host='192.168.4.96', port=6006): Max retries exceeded with url: /v3.0/OPENIO/lb/choose?type=account (Caused by ConnectTimeoutError(<requests.packages.urllib3.connection.HTTPConnection object at 0x7fdf42fd7550>, 'Connection to 192.168.4.96 timed out. (connect timeout=2.0)')) (txn: txb325dbb2c9ac4c3392a1e-005a431aee)
2017-12-27T12:00:55.410544+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  500 Internal Server Error: #012Traceback (most recent call last):#012  File "/usr/lib/python2.7/dist-packages/swift3/middleware.py", line 81, in __call__#012    resp = self.handle_request(req)#012  File "/usr/lib/python2.7/dist-packages/swift3/middleware.py", line 104, in handle_request#012    res = getattr(controller, req.method)(req)#012  File "/usr/lib/python2.7/dist-packages/swift3/controllers/bucket.py", line 146, in PUT#012    resp = req.get_response(self.app)#012  File "/usr/lib/python2.7/dist-packages/swift3/request.py", line 686, in get_response#012    headers, body, query)#012  File "/usr/lib/python2.7/dist-packages/swift3/request.py", line 671, in _get_response#012    raise InternalError('unexpected status code %d' % status)#012InternalError: 500 Internal Server Error (txn: txb325dbb2c9ac4c3392a1e-005a431aee)
2017-12-27T12:00:55.411367+08:00 gateway-2 OIO,OPENIO,oioswift,0: info  192.168.2.96 192.168.2.96 27/Dec/2017/04/00/55 PUT /firstbucket HTTP/1.0 500 - aws-cli/1.11.13%20Python/3.5.2%20Linux/4.4.0-62-generic%20botocore/1.4.70 - - 182 - txb325dbb2c9ac4c3392a1e-005a431aee - 8.7665 - - 1514347246.644407034 1514347255.410900116 -

#19

Hi Buddy

I rebooted the gateway node (and therefore the related services), and now it works: I’m able to create a bucket. Really appreciate it!

Still, I found an error reported in oioswift-0.log:

2017-12-27T14:13:57.436371+08:00 gateway-2 OIO,OPENIO,oioswift,0: err  500 Internal Server Error: #012Traceback (most recent call last):#012  File "/usr/lib/python2.7/dist-packages/swift3/middleware.py", line 81, in __call__#012    resp = self.handle_request(req)#012  File "/usr/lib/python2.7/dist-packages/swift3/middleware.py", line 104, in handle_request#012    res = getattr(controller, req.method)(req)#012  File "/usr/lib/python2.7/dist-packages/swift3/controllers/bucket.py", line 146, in PUT#012    resp = req.get_response(self.app)#012  File "/usr/lib/python2.7/dist-packages/swift3/request.py", line 686, in get_response#012    headers, body, query)#012  File "/usr/lib/python2.7/dist-packages/swift3/request.py", line 671, in _get_response#012    raise InternalError('unexpected status code %d' % status)#012InternalError: 500 Internal Server Error (txn: txe8f254fcd654433891b24-005a433a25) (client_ip: 192.168.2.96)
2017-12-27T14:13:57.437285+08:00 gateway-2 OIO,OPENIO,oioswift,0: info  192.168.2.96 192.168.2.96 27/Dec/2017/06/13/57 PUT /thirdbucket HTTP/1.0 500 - aws-cli/1.11.13%20Python/3.5.2%20Linux/4.4.0-62-generic%20botocore/1.4.70 - - 182 - txe8f254fcd654433891b24-005a433a25 - 0.2435 - - 1514355237.193053007 1514355237.436528921 -
2017-12-27T14:13:57.894929+08:00 gateway-2 OIO,OPENIO,oioswift,0: info  192.168.2.96 192.168.2.96 27/Dec/2017/06/13/57 PUT /thirdbucket HTTP/1.0 200 - aws-cli/1.11.13%20Python/3.5.2%20Linux/4.4.0-62-generic%20botocore/1.4.70 - - - - tx64439424b2fe4906868c6-005a433a25 - 0.3366 - - 1514355237.557826996 1514355237.894429922 -

Does it matter that it returns a 500?


#20

Yes, it matters, even if it doesn’t affect the data.
Could you show me the output of the following, please?

grep pipeline /etc/oio/sds/OPENIO/oioswift-0/proxy-server.conf


#21

Here you go:

root@gateway-2:~# grep pipeline /etc/oio/sds/OPENIO/oioswift-0/proxy-server.conf
[pipeline:main]
pipeline = catch_errors gatekeeper healthcheck proxy-logging cache swift3 s3token bulk tempurl ratelimit authtoken keystoneauth container-quotas account-quotas slo dlo versioned_writes proxy-logging proxy-server

#22

OK, could you replace your pipeline with this one:

pipeline = catch_errors proxy-logging gatekeeper healthcheck proxy-logging cache bulk tempurl proxy-logging ratelimit authtoken swift3 s3token keystoneauth staticweb copy container-quotas account-quotas slo dlo versioned_writes proxy-logging proxy-server

and then restart oioswift:

gridinit_cmd restart @oioswift

Could you also give me the version of the puppet-module-openio-openiosds package, please?

dpkg -l puppet-module-openio-openiosds