
🎉 Source BigCommerce: add Products Stream #9516

Merged

Conversation

guidoturtu
Contributor

@guidoturtu guidoturtu commented Jan 14, 2022

What

Add the catalog/products stream to source/bigcommerce.

How

Based on the customers stream, added the products stream:

  • The JSON schema was downloaded directly from the BigCommerce source and was not manipulated, to avoid errors.
    API REFERENCE.
  • Modified the source.py code:
    • Overrode order_field to drop the asc sort qualifier, because the products stream does not allow it (see the sketch below).
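
A minimal illustrative sketch of the override (not the actual connector code; class and attribute names are assumed to mirror the existing customers stream in source.py):

```python
# Hypothetical sketch: most BigCommerce streams sort with an ":asc" qualifier,
# but the catalog/products endpoint rejects it, so only the field name is sent.
class Customers:
    cursor_field = "date_modified"

    @property
    def order_field(self) -> str:
        return f"{self.cursor_field}:asc"


class Products(Customers):
    @property
    def order_field(self) -> str:
        return self.cursor_field
```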

Pre-merge Checklist

Expand the relevant checklist and delete the others.

Updating a connector

Community member or Airbyter

  • Grant edit access to maintainers (instructions)
  • Secrets in the connector's spec are annotated with airbyte_secret
  • Unit & integration tests added and passing. Community members, please provide proof of success locally e.g: screenshot or copy-paste unit, integration, and acceptance test output. To run acceptance tests for a Python connector, follow instructions in the README. For java connectors run ./gradlew :airbyte-integrations:connectors:<name>:integrationTest.
  • Code reviews completed
  • Documentation updated
    • Connector's README.md
    • Connector's bootstrap.md. See description and examples
    • Changelog updated in docs/integrations/<source or destination>/<name>.md including changelog. See changelog example
  • PR name follows PR naming conventions

Airbyter

If this is a community PR, the Airbyte engineer reviewing this PR is responsible for the below items.

  • Create a non-forked branch based on this PR and test the below items on it
  • Build is successful
  • Credentials added to Github CI. Instructions.
  • /test connector=connectors/<name> command is passing.
  • New Connector version released on Dockerhub by running the /publish command described here
  • After the new connector version is published, connector version bumped in the seed directory as described here
  • Seed specs have been re-generated by building the platform and committing the changes to the seed spec files, as described here



@CLAassistant

CLAassistant commented Jan 14, 2022

CLA assistant check
All committers have signed the CLA.

@github-actions github-actions bot added area/connectors Connector related issues area/documentation Improvements or additions to documentation labels Jan 14, 2022
@guidoturtu
Contributor Author

Signed the CLA

@alafanechere
Contributor

alafanechere commented Jan 14, 2022

Thanks for your contribution @guidoturtu, we'll review it shortly. Please make sure the email address you're using to commit is the same as the one on your GitHub account; otherwise, CLA Assistant cannot match your profile with the contribution (git config --global user.email <your-github-email>). This is why the CLA remained unsigned.

@guidoturtu
Contributor Author

My mistake, @alafanechere. I'm fixing it right now! Thanks in advance for the advice.

@alafanechere alafanechere mentioned this pull request Jan 14, 2022
@alafanechere alafanechere temporarily deployed to more-secrets January 14, 2022 17:19 Inactive
@octavia-squidington-iii octavia-squidington-iii temporarily deployed to more-secrets January 14, 2022 17:20 Inactive
Contributor

@alafanechere alafanechere left a comment


I think the acceptance tests are not passing because your schema is written in Swagger syntax instead of the expected JSON Schema syntax. Please make sure ./gradlew :airbyte-integrations:connectors:source-bigcommerce:integrationTest runs successfully and I'll go for a second review.
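
For illustration only (not taken from this PR), the difference looks roughly like this, shown as Python dicts: a Swagger/OpenAPI-style property versus the JSON Schema style the acceptance tests expect.

```python
# Illustrative only: Swagger/OpenAPI marks nullability with "nullable",
# while stream schemas must use JSON Schema type lists.
swagger_style = {
    "date_shipped": {"type": "string", "format": "date-time", "nullable": True},
}
json_schema_style = {
    "date_shipped": {"type": ["null", "string"], "format": "date-time"},
}
```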

@guidoturtu
Contributor Author

I am not running the tests with gradlew; is there any difference?

@alafanechere
Contributor

alafanechere commented Jan 17, 2022

I am not running the tests with gradlew; is there any difference?

If you run the acceptance/unit tests with pytest, there's no difference in what's being tested.

@alafanechere alafanechere temporarily deployed to more-secrets January 17, 2022 10:55 Inactive
@alafanechere alafanechere self-assigned this Jan 17, 2022
@octavia-squidington-iii octavia-squidington-iii temporarily deployed to more-secrets January 17, 2022 10:55 Inactive
@alafanechere
Contributor

I pulled your new commits on my side branch #9521 and one incremental acceptance test is failing: test_state_with_abnormally_large_values. Could you please fix this? Thanks!

@guidoturtu
Contributor Author

Thanks @alafanechere for your testing and review. We are also having trouble testing the original BigCommerce connector, even when pulled from master (the acceptance tests give us a timeout at 75% of the run). We are trying to figure the problem out and continue with development.

@alafanechere
Contributor

alafanechere commented Jan 19, 2022

@guidoturtu did you figure out your testing problem? Let me know if you want to investigate.

@guidoturtu
Contributor Author

Hello @alafanechere. Just to check: if you run the acceptance tests on the master branch (BigCommerce connector source, on airbytehq), do they all pass without problems?

Thanks in advance for the testing and your commitment.

@alafanechere
Contributor

Hello @alafanechere. Just to check: if you run the acceptance tests on the master branch (BigCommerce connector source, on airbytehq), do they all pass without problems?

Thanks in advance for the testing and your commitment.

Yes, the latest ones are running successfully:
https://dnsgjos7lj2fu.cloudfront.net/tests/summary/source-bigcommerce/

@guidoturtu
Contributor Author

Well, after a couple of investigations and changes, we realized that the main cause of the failures is a timeout. It originates in a bug where the date is not used to fetch a small batch of data, so the connector performs a full scan of the table. That's why we are having testing problems.
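
For illustration, a hedged sketch of what such an incremental fix amounts to: passing the saved cursor date as a request filter so only recently modified records are fetched instead of scanning the whole table. The date_modified:min parameter follows the BigCommerce v3 filter convention; the actual patch may look different.

```python
# Hypothetical sketch, not the actual patch: build request params from the
# saved stream state so the API returns a small batch instead of a full scan.
def request_params(stream_state: dict, cursor_field: str = "date_modified") -> dict:
    params = {"limit": 250, "sort": cursor_field}
    cursor_value = (stream_state or {}).get(cursor_field)
    if cursor_value:
        # BigCommerce v3-style filter: only records modified after the cursor.
        params[f"{cursor_field}:min"] = cursor_value
    return params
```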

After adding an "incremental patch" for this bug, we found other problems for orders and transactions, where the schema was not the correct one:

Test session starts (platform: darwin, Python 3.7.7, pytest 6.2.5, pytest-sugar 0.9.4)
cachedir: .pytest_cache
rootdir: /Users/guido/F14/resources/airbyte/git/airbyte_gt, configfile: pytest.ini
plugins: sugar-0.9.4, timeout-1.4.2
collecting ... 
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_match_expected[inputs0] ✓                       6% ▋         
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_required[inputs0] ✓                            12% █▍        
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_optional[inputs0] ✓                            19% █▉        
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_has_secret[inputs0] ✓                          25% ██▌       
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_secret_never_in_the_output[inputs0] ✓          31% ███▎      
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_defined_refs_exist_in_json_spec_file[inputs0] ✓38% ███▊      
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_oauth_flow_parameters[inputs0] ✓               44% ████▍     
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestConnection.test_check[inputs0] ✓                         50% █████     
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestConnection.test_check[inputs1] ✓                         56% █████▋    
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestDiscovery.test_discover[inputs0] ✓                       62% ██████▍   
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestDiscovery.test_defined_cursors_exist_in_schema[inputs0] ✓69% ██████▉   
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestDiscovery.test_defined_refs_exist_in_schema[inputs0] ✓   75% ███████▌  {"type": "LOG", "log": {"level": "ERROR", "message": "\nThe orders stream has the following schema errors:\n has invalid datetime format\n\nFailed validating 'format' in schema['properties']['date_shipped']:\n    {'format': 'date-time', 'type': ['null', 'string']}\n\nOn instance['date_shipped']:\n    ''"}}
{"type": "LOG", "log": {"level": "ERROR", "message": "\nThe transactions stream has the following schema errors:\n857.94 is not of type 'null', 'integer'\n\nFailed validating 'type' in schema['properties']['amount']:\n    {'type': ['null', 'integer']}\n\nOn instance['amount']:\n    857.94"}}


――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――― TestBasicRead.test_read[inputs0] ――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――

self = <source_acceptance_test.tests.test_core.TestBasicRead object at 0x7fbd37839a10>, connector_config = SecretDict(******)
configured_catalog = ConfiguredAirbyteCatalog(streams=[ConfiguredAirbyteStream(stream=AirbyteStream(name='customers', json_schema={'$schema...al'>, cursor_field=['date_modified'], destination_sync_mode=<DestinationSyncMode.append: 'append'>, primary_key=None)])
inputs = BasicReadTestConfig(config_path='secrets/config.json', configured_catalog_path='integration_tests/configured_catalog.j...'transactions', 'orders'}, expect_records=None, validate_schema=True, validate_data_points=False, timeout_seconds=None)
expected_records = [], docker_runner = <source_acceptance_test.utils.connector_runner.ConnectorRunner object at 0x7fbd37766a50>
detailed_logger = <AirbyteNativeLogger detailed_logger acceptance_tests_logs/test_core.py__TestBasicRead__test_read[inputs0].txt (DEBUG)>

    def test_read(
        self,
        connector_config,
        configured_catalog,
        inputs: BasicReadTestConfig,
        expected_records: List[AirbyteMessage],
        docker_runner: ConnectorRunner,
        detailed_logger,
    ):
        output = docker_runner.call_read(connector_config, configured_catalog)
        records = [message.record for message in filter_output(output, Type.RECORD)]
    
        assert records, "At least one record should be read using provided catalog"
    
        if inputs.validate_schema:
>           self._validate_schema(records=records, configured_catalog=configured_catalog)

../../bases/source-acceptance-test/source_acceptance_test/tests/test_core.py:325: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

records = [AirbyteRecordMessage(stream='customers', data={'id': 839, 'authentication': {'force_password_reset': False}, 'company...sactional_exchange_rate': '1.0000000000', 'custom_status': 'Refunded'}, emitted_at=1642797314914, namespace=None), ...]
configured_catalog = ConfiguredAirbyteCatalog(streams=[ConfiguredAirbyteStream(stream=AirbyteStream(name='customers', json_schema={'$schema...al'>, cursor_field=['date_modified'], destination_sync_mode=<DestinationSyncMode.append: 'append'>, primary_key=None)])

    @staticmethod
    def _validate_schema(records: List[AirbyteRecordMessage], configured_catalog: ConfiguredAirbyteCatalog):
        """
        Check if data type and structure in records matches the one in json_schema of the stream in catalog
        """
        TestBasicRead._validate_records_structure(records, configured_catalog)
        bar = "-" * 80
        streams_errors = verify_records_schema(records, configured_catalog)
        for stream_name, errors in streams_errors.items():
            errors = map(str, errors.values())
            str_errors = f"\n{bar}\n".join(errors)
            logging.error(f"\nThe {stream_name} stream has the following schema errors:\n{str_errors}")
    
        if streams_errors:
>           pytest.fail(f"Please check your json_schema in selected streams {tuple(streams_errors.keys())}.")
E           Failed: Please check your json_schema in selected streams ('orders', 'transactions').

../../bases/source-acceptance-test/source_acceptance_test/tests/test_core.py:231: Failed
--------------------------------------------------------------------------- Captured log call ---------------------------------------------------------------------------
ERROR    root:test_core.py:228 
The orders stream has the following schema errors:
 has invalid datetime format

Failed validating 'format' in schema['properties']['date_shipped']:
    {'format': 'date-time', 'type': ['null', 'string']}

On instance['date_shipped']:
    ''
ERROR    root:test_core.py:228 
The transactions stream has the following schema errors:
857.94 is not of type 'null', 'integer'

Failed validating 'type' in schema['properties']['amount']:
    {'type': ['null', 'integer']}

On instance['amount']:
    857.94

 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestBasicRead.test_read[inputs0] ⨯                           81% ████████▎ 
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_full_refresh.py::TestFullRefresh.test_sequential_reads[inputs0] ✓     88% ████████▊ 

―――――――――――――――――――――――――――――――――――――――――――――――――――――――――― TestIncremental.test_two_sequential_reads[inputs0] ―――――――――――――――――――――――――――――――――――――――――――――――――――――――――――

self = <source_acceptance_test.tests.test_incremental.TestIncremental object at 0x7fbd37ac6fd0>, connector_config = SecretDict(******)
configured_catalog_for_incremental = ConfiguredAirbyteCatalog(streams=[ConfiguredAirbyteStream(stream=AirbyteStream(name='customers', json_schema={'$schema...al'>, cursor_field=['date_modified'], destination_sync_mode=<DestinationSyncMode.append: 'append'>, primary_key=None)])
cursor_paths = {'customers': ['date_modified'], 'orders': ['date_modified'], 'pages': ['id'], 'products': ['date_modified'], ...}
docker_runner = <source_acceptance_test.utils.connector_runner.ConnectorRunner object at 0x7fbd37bf5290>

    def test_two_sequential_reads(self, connector_config, configured_catalog_for_incremental, cursor_paths, docker_runner: ConnectorRunner):
        stream_mapping = {stream.stream.name: stream for stream in configured_catalog_for_incremental.streams}
    
        output = docker_runner.call_read(connector_config, configured_catalog_for_incremental)
        records_1 = filter_output(output, type_=Type.RECORD)
        states_1 = filter_output(output, type_=Type.STATE)
    
        assert states_1, "Should produce at least one state"
        assert records_1, "Should produce at least one record"
    
        latest_state = states_1[-1].state.data
        for record_value, state_value, stream_name in records_with_state(records_1, latest_state, stream_mapping, cursor_paths):
>           assert (
                record_value <= state_value
            ), f"First incremental sync should produce records younger or equal to cursor value from the state. Stream: {stream_name}"
E           AssertionError: First incremental sync should produce records younger or equal to cursor value from the state. Stream: orders
E           assert 'Tue, 18 Jan 2022 13:41:46 +0000' <= 'Fri, 21 Jan 2022 05:51:46 +0000'

../../bases/source-acceptance-test/source_acceptance_test/tests/test_incremental.py:93: AssertionError

 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_incremental.py::TestIncremental.test_two_sequential_reads[inputs0] ⨯  94% █████████▍
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_incremental.py::TestIncremental.test_state_with_abnormally_large_values[inputs0] ✓100% ██████████
======================================================================== short test summary info ========================================================================
FAILED ../../bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestBasicRead::test_read[inputs0] - Failed: Please check your json_schema in sele...
FAILED ../../bases/source-acceptance-test/source_acceptance_test/tests/test_incremental.py::TestIncremental::test_two_sequential_reads[inputs0] - AssertionError: Firs...

Results (67.75s):
      14 passed
       2 failed
         - airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py:310 TestBasicRead.test_read[inputs0]
         - airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_incremental.py:81 TestIncremental.test_two_sequential_reads[inputs0]

So we fixed those two schema errors (also following Marcos's recommendations) and re-ran the tests:
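
For reference, the fixes were roughly of this shape (illustrative Python dicts; the actual stream schema files live under the connector's schemas directory):

```python
# Illustrative only: the shape of the two schema adjustments described above.
# orders.date_shipped: the API can return "", so the "date-time" format is dropped.
orders_date_shipped = {"type": ["null", "string"]}
# transactions.amount: the API returns decimals such as 857.94, so "integer" becomes "number".
transactions_amount = {"type": ["null", "number"]}
```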

Test session starts (platform: darwin, Python 3.7.7, pytest 6.2.5, pytest-sugar 0.9.4)
cachedir: .pytest_cache
rootdir: /Users/guido/F14/resources/airbyte/git/airbyte_gt, configfile: pytest.ini
plugins: sugar-0.9.4, timeout-1.4.2
collecting ... 
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_match_expected[inputs0] ✓                                                                                6% ▋         
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_required[inputs0] ✓                                                                                     12% █▍        
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_optional[inputs0] ✓                                                                                     19% █▉        
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_has_secret[inputs0] ✓                                                                                   25% ██▌       
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_secret_never_in_the_output[inputs0] ✓                                                                   31% ███▎      
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_defined_refs_exist_in_json_spec_file[inputs0] ✓                                                         38% ███▊      
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_oauth_flow_parameters[inputs0] ✓                                                                        44% ████▍     
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestConnection.test_check[inputs0] ✓                                                                                  50% █████     
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestConnection.test_check[inputs1] ✓                                                                                  56% █████▋    
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestDiscovery.test_discover[inputs0] ✓                                                                                62% ██████▍   
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestDiscovery.test_defined_cursors_exist_in_schema[inputs0] ✓                                                         69% ██████▉   
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestDiscovery.test_defined_refs_exist_in_schema[inputs0] ✓                                                            75% ███████▌  
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestBasicRead.test_read[inputs0] ✓                                                                                    81% ████████▎ 
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_full_refresh.py::TestFullRefresh.test_sequential_reads[inputs0] ✓                                                              88% ████████▊ 

――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――― TestIncremental.test_two_sequential_reads[inputs0] ―――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――

self = <source_acceptance_test.tests.test_incremental.TestIncremental object at 0x7ff7a016bbd0>, connector_config = SecretDict(******)
configured_catalog_for_incremental = ConfiguredAirbyteCatalog(streams=[ConfiguredAirbyteStream(stream=AirbyteStream(name='customers', json_schema={'$schema...al'>, cursor_field=['date_modified'], destination_sync_mode=<DestinationSyncMode.append: 'append'>, primary_key=None)])
cursor_paths = {'customers': ['date_modified'], 'orders': ['date_modified'], 'pages': ['id'], 'products': ['date_modified'], ...}
docker_runner = <source_acceptance_test.utils.connector_runner.ConnectorRunner object at 0x7ff7a0107650>

    def test_two_sequential_reads(self, connector_config, configured_catalog_for_incremental, cursor_paths, docker_runner: ConnectorRunner):
        stream_mapping = {stream.stream.name: stream for stream in configured_catalog_for_incremental.streams}
    
        output = docker_runner.call_read(connector_config, configured_catalog_for_incremental)
        records_1 = filter_output(output, type_=Type.RECORD)
        states_1 = filter_output(output, type_=Type.STATE)
    
        assert states_1, "Should produce at least one state"
        assert records_1, "Should produce at least one record"
    
        latest_state = states_1[-1].state.data
        for record_value, state_value, stream_name in records_with_state(records_1, latest_state, stream_mapping, cursor_paths):
>           assert (
                record_value <= state_value
            ), f"First incremental sync should produce records younger or equal to cursor value from the state. Stream: {stream_name}"
E           AssertionError: First incremental sync should produce records younger or equal to cursor value from the state. Stream: orders
E           assert 'Tue, 18 Jan 2022 13:41:46 +0000' <= 'Mon, 24 Jan 2022 17:17:36 +0000'

../../bases/source-acceptance-test/source_acceptance_test/tests/test_incremental.py:93: AssertionError

 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_incremental.py::TestIncremental.test_two_sequential_reads[inputs0] ⨯                                                           94% █████████▍
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_incremental.py::TestIncremental.test_state_with_abnormally_large_values[inputs0] ✓                                            100% ██████████
==================================================================================================== short test summary info =====================================================================================================
FAILED ../../bases/source-acceptance-test/source_acceptance_test/tests/test_incremental.py::TestIncremental::test_two_sequential_reads[inputs0] - AssertionError: First incremental sync should produce records younger or equa...

Results (71.61s):
      15 passed
       1 failed
         - airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_incremental.py:81 TestIncremental.test_two_sequential_reads[inputs0]

The conclusion is that the acceptance test compares two strings, not two dates (date-time format is not a JSON Schema default), so we removed the validations for orders from the configured_catalog. We did not know: is there any way to override the tests for one particular stream?

'Tue, 18 Jan 2022 13:41:46 +0000' <= 'Mon, 24 Jan 2022 17:17:36 +0000' --> here a cast to another date-time format is needed
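
A quick self-contained illustration of why the lexicographic comparison fails here, assuming the cursor values stay in RFC 2822 format:

```python
# 'T' sorts after 'M', so the earlier date looks "greater" when compared as a string.
from email.utils import parsedate_to_datetime

record_cursor = "Tue, 18 Jan 2022 13:41:46 +0000"
state_cursor = "Mon, 24 Jan 2022 17:17:36 +0000"

print(record_cursor <= state_cursor)  # False: plain string comparison
print(parsedate_to_datetime(record_cursor) <= parsedate_to_datetime(state_cursor))  # True: date comparison
```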

Test session starts (platform: darwin, Python 3.7.7, pytest 6.2.5, pytest-sugar 0.9.4)
cachedir: .pytest_cache
rootdir: /Users/guido/F14/resources/airbyte/git/airbyte_gt, configfile: pytest.ini
plugins: sugar-0.9.4, timeout-1.4.2
collecting ... 
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_match_expected[inputs0] ✓                                                                                6% ▋         
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_required[inputs0] ✓                                                                                     12% █▍        
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_optional[inputs0] ✓                                                                                     19% █▉        
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_has_secret[inputs0] ✓                                                                                   25% ██▌       
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_secret_never_in_the_output[inputs0] ✓                                                                   31% ███▎      
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_defined_refs_exist_in_json_spec_file[inputs0] ✓                                                         38% ███▊      
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestSpec.test_oauth_flow_parameters[inputs0] ✓                                                                        44% ████▍     
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestConnection.test_check[inputs0] ✓                                                                                  50% █████     
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestConnection.test_check[inputs1] ✓                                                                                  56% █████▋    
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestDiscovery.test_discover[inputs0] ✓                                                                                62% ██████▍   
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestDiscovery.test_defined_cursors_exist_in_schema[inputs0] ✓                                                         69% ██████▉   
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestDiscovery.test_defined_refs_exist_in_schema[inputs0] ✓                                                            75% ███████▌  
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_core.py::TestBasicRead.test_read[inputs0] ✓                                                                                    81% ████████▎ 
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_full_refresh.py::TestFullRefresh.test_sequential_reads[inputs0] ✓                                                              88% ████████▊ 
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_incremental.py::TestIncremental.test_two_sequential_reads[inputs0] ✓                                                           94% █████████▍
 airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_incremental.py::TestIncremental.test_state_with_abnormally_large_values[inputs0] ✓                                            100% ██████████

Results (82.46s):
      16 passed

@guidoturtu
Contributor Author

JIC, for our local testing we deleted orders from the configured catalog, but we did not push that change to the PR, to avoid changing the connector structure.

@alafanechere
Contributor

Thank you @guidoturtu for this investigation. I'll try to reproduce and suggest a workaround.

@alafanechere alafanechere temporarily deployed to more-secrets January 26, 2022 09:36 Inactive
@@ -9,8 +9,7 @@
       "type": ["null", "string"]
     },
     "date_shipped": {
-      "type": ["null", "string"],
-      "format": "date-time"
+      "type": ["null", "string"]
Contributor


@marcosmarxm what would be the impact of this change on normalization? This column won't be considered a date anymore, which might impact users of this connector after an upgrade. A more general question: how does normalization handle changes to the schema?

@alafanechere
Contributor

@guidoturtu I think our tests are passing on master because our BigCommerce account does not have any orders or transactions. I'm able to reproduce when adding some orders to our account.

@alafanechere
Contributor

@guidoturtu I'm going to implement a custom schema type transformation to properly cast date_shipped to a valid date-time format.
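
For context, a hedged sketch of what such a transform could look like with the Airbyte CDK's TypeTransformer; the actual code merged in this PR may differ:

```python
# Hypothetical sketch: normalize date-time fields before schema validation.
# Assumes the Airbyte CDK TypeTransformer API; not necessarily the merged code.
from email.utils import parsedate_to_datetime

from airbyte_cdk.sources.utils.transform import TransformConfig, TypeTransformer

transformer = TypeTransformer(TransformConfig.CustomSchemaNormalization)


@transformer.registerCustomTransform
def transform_date_time(original_value, field_schema):
    if field_schema.get("format") == "date-time" and isinstance(original_value, str):
        if not original_value:
            return None  # the API returns "" for unset dates such as date_shipped
        try:
            return parsedate_to_datetime(original_value).isoformat()
        except (TypeError, ValueError):
            return original_value
    return original_value
```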

@alafanechere alafanechere changed the title Add Catalog Products Stream 🎉 Source BigCommerce: add Products Stream Jan 26, 2022
@octavia-squidington-iii octavia-squidington-iii temporarily deployed to more-secrets January 26, 2022 15:44 Inactive
@alafanechere alafanechere temporarily deployed to more-secrets January 26, 2022 19:37 Inactive
Contributor

@alafanechere alafanechere left a comment


Looks like the custom transform function I added fixed the tests.
Going to publish the connector and merge this.

@octavia-squidington-iii octavia-squidington-iii temporarily deployed to more-secrets January 26, 2022 19:41 Inactive
@alafanechere alafanechere temporarily deployed to more-secrets January 26, 2022 20:22 Inactive
@guidoturtu
Contributor Author

🚀 ❤️ thanks @alafanechere

@alafanechere alafanechere merged commit 4d48e77 into airbytehq:master Jan 26, 2022
@guidoturtu guidoturtu deleted the source-bigcommerce_product_catalog branch January 26, 2022 21:09