
[Python] Changes to the templates for test code of python client. #4514

Merged: 1 commit merged into OpenAPITools:master from the pythonclient_tests branch on Jan 22, 2020

Conversation

@michelealbano (Contributor) commented Nov 17, 2019

Generation of test code for the python client, for everything "required":

  • every module has a method to instantiate the class with some default values for all required fields
  • every API has a method to call it with some default values for all required parameters

I am not sure about the implementation for "binary", but everything else should be fine. Please provide comments and I will improve my contribution.
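
To give a sense of the output, here is a rough sketch of the kind of model test these templates generate. This is illustrative only: the openapi_client package, the Widget model, and its fields are hypothetical stand-ins, not the literal template output.

import unittest

import openapi_client  # hypothetical generated package name


class TestWidget(unittest.TestCase):
    """Widget unit test stubs (illustrative sketch of the generated output)."""

    def make_instance(self, include_optional):
        """Build a Widget, filling every required field with a default value."""
        if include_optional:
            return openapi_client.Widget(
                name='example_name',       # required field
                label='example_label')     # optional field
        else:
            return openapi_client.Widget(
                name='example_name')       # required fields only

    def testWidget(self):
        """Instantiation must succeed both with and without optional fields."""
        self.make_instance(include_optional=True)
        self.make_instance(include_optional=False)


if __name__ == '__main__':
    unittest.main()

The generated API tests follow the same idea: each operation gets a test method that calls it with default values for its required parameters.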

PR checklist

  • [ X ] Read the contribution guidelines.
  • [ X ] If contributing template-only or documentation-only changes which will change sample output, build the project before.
  • [ X ] Run the shell script(s) under ./bin/ (or Windows batch scripts under .\bin\windows) to update Petstore samples related to your fix. This is important, as CI jobs will verify all generator outputs of your HEAD commit, and these must match the expectations made by your contribution. You only need to run ./bin/{LANG}-petstore.sh, ./bin/openapi3/{LANG}-petstore.sh if updating the code or mustache templates for a language ({LANG}) (e.g. php, ruby, python, etc).
  • [ X ] File the PR against the correct branch: master, 4.3.x, 5.0.x. Default: master.
  • Copy the technical committee to review the pull request if your PR is targeting a particular programming language.

@michelealbano changed the title from "Changes to the templates for test code of python client." to "[Python] Changes to the templates for test code of python client." on Nov 17, 2019
@spacether (Contributor)

@michelealbano can you copy the members of the python technical committee from here?
https://github.com/OpenAPITools/openapi-generator#62---openapi-generator-technical-committee

@michelealbano (Contributor, Author)

Like this?
@taxpon
@frol
@mbohlool
@cbornet
@kenjones-cisco
@tomplus
@Jyhess
@slash-arun
@spacether

@michelealbano (Contributor, Author)

What should I do to support the review process for this code?

@@ -27,8 +39,14 @@ class {{#operations}}Test{{classname}}(unittest.TestCase):
{{#summary}}
{{{summary}}} # noqa: E501
{{/summary}}

Contributor

Why not use code in here?
https://github.com/OpenAPITools/openapi-generator/blob/master/modules/openapi-generator/src/main/resources/python/api_doc_example.mustache
If you separated api_doc_example.mustache into:

  • api_doc_example_imports.mustache
  • api_doc_example_code.mustache

then you could use api_doc_example_code.mustache here and use the same mustache file in two places.

Contributor Author

Nice idea. Can you provide an example? I have not worked on the documentation so far.

Contributor Author

As you suggested in the comment above, I will focus on the models, and I will correct this in a future PR.

@spacether (Contributor) left a comment

Thanks for your PR! Please address the open review comments.

@@ -5,14 +5,21 @@
from __future__ import absolute_import

import unittest
import datetime

@spacether (Contributor) commented Nov 21, 2019

Please delete some of the existing model tests and have your template generate new examples. How about replacing the below tests:

  • samples/client/petstore/python/test/test_animal.py (discriminator model)
  • samples/client/petstore/python/test/test_cat.py (composed allof model)
  • samples/client/petstore/python/test/test_enum_class.py (model with enums)
  • samples/client/petstore/python/test/test_format_test.py (normal model with many parameter types)

Contributor Author

I will have no time for that.

Contributor

The request up above will take very little time.
To do it one would run:

rm samples/client/petstore/python/test/test_animal.py
rm samples/client/petstore/python/test/test_cat.py
rm samples/client/petstore/python/test/test_enum_class.py
rm samples/client/petstore/python/test/test_format_test.py
bin/python-petstore.sh
# turn on the test server
docker run -d -e SWAGGER_HOST=http://petstore.swagger.io -e SWAGGER_BASE_PATH=/v2 -p 80:8080 swaggerapi/petstore
# test the updated client
cd samples/client/petstore/python
# run the tests
make test

That should take around a minute max.

These api_test and model_test updates will impact all of the tests that are auto generated when the users run the generator.
This pull request does not yet include any tests of this new functionality.
Tests will need to be added showing that both the new model tests and the api client tests produce code which runs and passes tests.
If these tests were omitted, this update could break the tests produced in the python client. If this is too much work for this PR, the model_test update and the api_test update can be broken up into different PRs.

I am hearing that you are busy. We can wait until you have time to add this test verification. It is very important that new changes do not break the existing tests, so we can go at whatever speed works for you.

Contributor

Another thing that we can do is look at the content of model_test.mustache in other generators and see how they handle instantiating parameters that have model values. Maybe they have some cool tricks that we can use.

Contributor Author

Hi! I tried to remove and re-create all the existing samples/client/petstore/python/test/*, and the hardest tests to manage are exactly the ones with discriminators, since python and python-experimental produce different signatures. For example, for petstore/models/Cat.py:
python: def __init__(self, declawed=None, local_vars_configuration=None):
python-experimental: def __init__(self, class_name, _check_type=True, _from_server=False, _path_to_item=(), _configuration=None, **kwargs):

The funny thing is that class_name (used in the discriminator) is NOT in the old generator, and it is required (it has no default) in the python-experimental generator.
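
For illustration only (assuming the generated petstore_api package, with made-up property values), a test would have to construct Cat differently against the two clients:

import petstore_api  # the generated petstore sample client is assumed here

# python client: class_name is filled in internally via the discriminator
cat = petstore_api.Cat(declawed=True)

# python-experimental client: class_name is a required positional argument
cat = petstore_api.Cat('Cat', declawed=True)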

The only solution I see is to develop a model_test.mustache for the old python generator, and a model_test_experimental.mustache for python-experimental.

Any suggestions on alternative solutions?

Contributor Author

No, I mean when generating the code. For example:

  Tiger:
    type: object
    required:
      - teeth
      - cat
    properties:
      teeth:
        type: string
      cat:
        $ref: '#/definitions/Cat'
  Cat:
    type: object
    required:
      - fur
    properties:
      fur:
        type: string
      animal:
        $ref: '#/definitions/Animal'
  Animal:
    type: object
    required:
      - color
    properties:
      color:
        type: string

What do you do when generating the make_instance for Tiger? The options are to generate:

            return Tiger(
                teeth = 'many',
                cat = openapi_client.models.cat.Cat(
                    fur = 'short',
                    animal = openapi_client.models.animal.Animal(
                        color = 'black and yellow',
                    )))

or

            return Tiger(
                teeth = 'many',
                cat = openapi_client.models.cat.Cat(
                    fur = 'short',
                ))

Ideally, you would generate the first when include_optional is True and the second when include_optional is False. However, the code for the Cat is generated by toExampleValue, which does NOT know what include_optional is. Thus, internal properties are either generated ALWAYS with all optional parameters, or NEVER with optional parameters.

And now, here is the bomb:

  TagList:
    type: object
    required:
      - name
    properties:
      name:
        type: string
      nextTag:
        $ref: '#/definitions/TagList'

It is a linked list, and it should have an empty nextTag sooner or later. However, if we always include optional parameters in toExampleValue, we will have infinite recursion.
So, we should NEVER add optional parameters at the toExampleValue level. Do you agree?

@spacether (Contributor) commented Jan 18, 2020

So I see two solutions to these problems.
For the problem of choosing whether optional properties are included: create a string property in CodegenProperty called exampleOptional. In it, store the example where include_optional is true; in example, store the example where include_optional is false.

For the problem of a model with params or items of type model, you can keep a list of the schemas or dataTypes that you are currently inspecting in your toExample function. Every time you call it recursively, add a schema. If we are about to add a schema that we have already added, return instead, which will end our recursive calls.
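
A rough sketch of that second idea, the cycle guard, written in Python for illustration; the real implementation belongs in the generator's toExampleValue code, and the build_example function, the schemas dict layout, and the placeholder values below are assumptions made purely for this sketch:

def build_example(schema_name, schemas, seen=None):
    """Build a nested example dict for schema_name, cutting off any schema
    that is already being visited on the current path."""
    seen = set() if seen is None else seen
    if schema_name in seen:
        # Cycle detected (e.g. TagList -> nextTag -> TagList): stop recursing.
        return None
    seen = seen | {schema_name}

    example = {}
    for prop_name, prop in schemas[schema_name].get('properties', {}).items():
        ref = prop.get('$ref')
        if ref:
            nested = build_example(ref.split('/')[-1], schemas, seen)
            if nested is not None:
                example[prop_name] = nested
        else:
            example[prop_name] = 'example'  # placeholder for primitive types
    return example

With the TagList schema above, the nextTag reference is dropped on the repeat visit instead of recursing forever, while acyclic chains such as Tiger -> Cat -> Animal are still expanded in full.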

Contributor Author

The first solution implies changing the data model, and I prefer to leave it to the next developer.

I have implemented the second solution.

@spacether (Contributor) commented Jan 19, 2020

I am hearing that you don't want to edit the CodegenProperty.java class.
For the first solution we can support deep include_optional=True without changing the data model.
This would work by storing the exampleOptional string value in a property that we are not using in CodegenProperty.
Why not store that value in the parameter.unescapedDescription property and add a comment describing what we are storing in it?
If this is something that you still don't want to do, that's fine; just let me know. I just want you to know that there is another viable solution here.

Contributor Author

Yep, I would prefer to have this PR accepted, then do something for the API, and then come back to this issue after discussing it with you in a chat.

@michelealbano (Contributor, Author)

Hi, I cannot understand why the tests fail. Can you help?

@spacether (Contributor)

> Hi, I cannot understand why the tests fail. Can you help?

So one test is failing because of some Go issue. This should not be happening; a rebase on master may fix this.

Another test is failing because of a connexion library issue.
Rebasing your branch on master will definitely solve that problem.

Can you rebase your branch on the master branch?

@jimschubert (Member)

> Could not GET 'http://repo1.maven.org/maven2/io/github/http-builder-ng/http-builder-ng-core/1.0.3/http-builder-ng-core-1.0.3.pom'. Received status code 501 from server: HTTPS Required

This submitted PR attempts to fix it: #5045

#5045 only changes maven repo declarations to match what is documented on Gradle's site. Current master contains all fixes for the maven http issues. If you merge master into this branch, those issues should be gone.

@michelealbano force-pushed the pythonclient_tests branch 3 times, most recently from bd73895 to 2ce20b2, on January 20, 2020 03:12
@michelealbano (Contributor, Author)

Ok. Problem with version control. Done

@spacether (Contributor) left a comment

@michelealbano the http url needs to be reverted, and the init file removal in pythonclient needs to be reverted. You mentioned that you wouldn't be available for a bit. I could

  • wait for you to make the fixes and approve afterwards, or
  • you could add me as a contributor and I could make the update and request an approval from the core team while you are busy

Either works. Please let me know what you prefer.

@michelealbano (Contributor, Author)

Please elaborate on #4514 (review)

  1. The https->http change in the comment was already reverted. Which other one do you need?
  2. I will revert back to having the test package. I can do it in a few minutes, but you previously suggested not to have it. Which do you want?

@spacether (Contributor) left a comment

@michelealbano I added comments in the specific remaining locations

Tests for python client, including support for additional_properties and arrays.
There are ugly workarounds for when there are discriminators, since the python client generator does not fully handle them.
Cool indentation of test files.

@spacether (Contributor) left a comment

@michelealbano thank you for all of your hard work on this!
This looks good.

@michelealbano (Contributor, Author)

Is there anything else that needs to be done to complete this PR?

@wing328 merged commit bf57a99 into OpenAPITools:master on Jan 22, 2020
@wing328 (Member) commented Jan 22, 2020

@michelealbano the PR looks good to me and I've just merged it.

@spacether thanks for reviewing the change.

@wing328 added this to the 4.2.3 milestone on Jan 22, 2020
@michelealbano deleted the pythonclient_tests branch on January 22, 2020 15:36
@jimschubert (Member)

Just a heads up, this change may have caused a regression in master. Once we have more details, the change may need to be reverted and re-opened against the 4.3.x branch.

I'd recommend future changes for this work, regardless of whether it's temporarily reverted:

  • Wrap toExampleValueRecursive in a try/catch, as examples should not cause exceptions or block generation (see the sketch after this list)
  • Use examples available in the spec and instantiation types defined in the generator rather than hard-coding examples
  • Write some tests for toExampleValueRecursive
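
A minimal sketch of the fail-soft pattern behind the first recommendation, written in Python for readability; the real toExampleValueRecursive lives in the generator's Java code, and safe_example below is a made-up illustration rather than part of either codebase:

import logging


def safe_example(build_fn, *args, **kwargs):
    """Call an example builder, but never let a failure abort code generation."""
    try:
        return build_fn(*args, **kwargs)
    except Exception as exc:  # deliberately broad: log and fall back instead of raising
        logging.warning("Example generation failed (%s); emitting no example instead", exc)
        return None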

cc @spacether

@spacether (Contributor) commented Jan 22, 2020

@jimschubert this is a PR on the python generator.
Would the new release be the latest minor version, since it's a non-breaking change?

@jimschubert (Member) commented Jan 22, 2020

@spacether it looks like an NPE, which would probably have been caught if the new method had been unit tested.

I've opened #5082, but won't have time to verify it until probably later tonight.

@spacether (Contributor)

Good point
