From f376bd677d4d606b8bdeac80abce94d0cc13be4b Mon Sep 17 00:00:00 2001
From: seem
Date: Sun, 14 Aug 2022 16:40:21 +1000
Subject: [PATCH] revert remove `nbs/17_release.ipynb`

---
 nbs/17_release.ipynb | 769 +++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 769 insertions(+)
 create mode 100644 nbs/17_release.ipynb

diff --git a/nbs/17_release.ipynb b/nbs/17_release.ipynb
new file mode 100644
index 000000000..a30c9453f
--- /dev/null
+++ b/nbs/17_release.ipynb
@@ -0,0 +1,769 @@
+{
+ "cells": [
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| default_exp release"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# release\n",
+    "\n",
+    "> Auto-generated tagged releases and release notes from GitHub issues"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Overview"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "`nbdev.release` provides 3 commands that you can run from your shell to manage your changelog file and git releases:\n",
+    "\n",
+    "- `nbdev_changelog`: creates a CHANGELOG.md file from closed and labeled GitHub issues\n",
+    "- `nbdev_release_git`: tags and creates a release in GitHub for the current version\n",
+    "- `nbdev_release_gh`: calls `nbdev_changelog`, lets you edit the result, then pushes to git and calls `nbdev_release_git`\n",
+    "\n",
+    "It provides 3 further commands for releasing packages on PyPI or conda:\n",
+    "\n",
+    "- `nbdev_pypi`: Create and upload a PyPI installer\n",
+    "- `nbdev_conda`: Create and upload a conda installer\n",
+    "- `nbdev_release_both`: Create and upload both PyPI and conda installers\n",
+    "\n",
+    "Here's a brief demonstration of how to use the changelog and git release tools in `nbdev.release`. This demo first creates an issue using the [`gh`](https://cli.github.com/) command line tool, and then closes it using `git`; you can also use GitHub's web interface for both of these tasks. (Note that this functionality used to be in a project called `fastrelease`, so in the video the command line tools have different names, starting with `fastrelease_` instead of `nbdev_`)."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    ""
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Setup"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "You'll need to get a GitHub [personal access token](https://docs.github.com/en/github/authenticating-to-github/creating-a-personal-access-token) if you haven't already. To do so, [click here](https://github.com/settings/tokens/new) and enter \"nbdev\" in the \"Note\" section, and click the `repo` checkbox.\n",
+    "\n",
+    "Then click \"Generate Token\" at the bottom of the screen, and copy the token (the long string of letters and numbers shown). You can easily do that by clicking the little clipboard icon next to the token.\n",
+    "\n",
+    "*Screenshot: copying the token*\n",
+    "\n",
+    "Paste that token into a file called `token` in the root of your repo. You can run the following in your terminal (`cd` to the root of your repo first) to create that file:\n",
+    "\n",
+    "    echo XXX > token\n",
+    "\n",
+    "Replace *XXX* above with the token you copied.\n",
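+    "\n",
+    "Alternatively, you can put the token in an `NBDEV_TOKEN` environment variable; the `Release` class defined below checks that variable before falling back to the `token` file:\n",
+    "\n",
+    "    export NBDEV_TOKEN=XXX\n",
+    "\n",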
+    "Also, ensure that the `token` file isn't added to git, by running this in your terminal:\n",
+    "\n",
+    "    echo token >> .gitignore"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Creating release notes"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Now you're ready to create your release notes. These are created in a file called `CHANGELOG.md`. Here's an example of what it creates: [nbdev CHANGELOG](https://github.com/fastai/nbdev/blob/master/CHANGELOG.md).\n",
+    "\n",
+    "All issues with the label **bug**, **enhancement**, or **breaking** that have been closed in your repo since your last release will be added to the top of this file. If you haven't made any releases before, then all issues with those labels will be included.\n",
+    "\n",
+    "Therefore, before you create or update `CHANGELOG.md`, go to your GitHub issues page, remove `is:open` from the filter, and label any issues you want included with one of the labels above. When you've done that, you can create or update your release notes by running in your terminal:\n",
+    "\n",
+    "    nbdev_changelog\n",
+    "\n",
+    "The titles and bodies of each issue will be added. Open `CHANGELOG.md` in your editor and make any edits that you want, and then commit the file to your repo (remember to `git add` it!)."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Tagging a release"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "You should now tag a release. This will create a tag in GitHub with your current version number in `settings.ini`, and will then make it into a release, using your latest release notes as the description of the release:\n",
+    "\n",
+    "    nbdev_release_git\n",
+    "\n",
+    "After you run this, be sure to increment your version number in `settings.ini`. You can either edit it manually, or have nbdev do it for you by running:\n",
+    "\n",
+    "    nbdev_bump_version"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Doing both (creating release notes and tagging a release)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "To complete both of the steps above, run:\n",
+    "\n",
+    "```\n",
+    "nbdev_release_gh\n",
+    "```\n",
+    "\n",
+    "See the screencast above for a demonstration of this.\n",
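+    "\n",
+    "Under the hood this is roughly equivalent to running the steps yourself (see the `release_gh` source below):\n",
+    "\n",
+    "    nbdev_changelog\n",
+    "    # edit CHANGELOG.md, then:\n",
+    "    git commit -am release\n",
+    "    git push\n",
+    "    nbdev_release_git"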
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Python API"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "from fastcore.all import *\n",
+    "from ghapi.core import *\n",
+    "\n",
+    "from datetime import datetime\n",
+    "from configparser import ConfigParser\n",
+    "import shutil,subprocess"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| hide\n",
+    "from nbdev.showdoc import show_doc"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "GH_HOST = \"https://api.github.com\""
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "def _find_config(cfg_name=\"settings.ini\"):\n",
+    "    cfg_path = Path().absolute()\n",
+    "    while cfg_path != cfg_path.parent and not (cfg_path/cfg_name).exists(): cfg_path = cfg_path.parent\n",
+    "    return Config(cfg_path, cfg_name)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "def _issue_txt(issue):\n",
+    "    res = '- {} ([#{}]({}))'.format(issue.title.strip(), issue.number, issue.html_url)\n",
+    "    if hasattr(issue, 'pull_request'): res += ', thanks to [@{}]({})'.format(issue.user.login, issue.user.html_url)\n",
+    "    res += '\\n'\n",
+    "    if not issue.body: return res\n",
+    "    return res + f\"  - {issue.body.strip()}\\n\"\n",
+    "\n",
+    "def _issues_txt(iss, label):\n",
+    "    if not iss: return ''\n",
+    "    res = f\"### {label}\\n\\n\"\n",
+    "    return res + '\\n'.join(map(_issue_txt, iss))\n",
+    "\n",
+    "def _load_json(cfg, k):\n",
+    "    try: return json.loads(cfg[k])\n",
+    "    except json.JSONDecodeError as e: raise Exception(f\"Key: `{k}` in .ini file is not a valid JSON string: {e}\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Release -"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "class Release:\n",
+    "    def __init__(self, owner=None, repo=None, token=None, **groups):\n",
+    "        \"Create CHANGELOG.md from GitHub issues\"\n",
+    "        self.cfg = _find_config()\n",
+    "        self.changefile = self.cfg.config_path/'CHANGELOG.md'\n",
+    "        if not groups:\n",
+    "            default_groups=dict(breaking=\"Breaking Changes\", enhancement=\"New Features\", bug=\"Bugs Squashed\")\n",
+    "            groups=_load_json(self.cfg, 'label_groups') if 'label_groups' in self.cfg else default_groups\n",
+    "        os.chdir(self.cfg.config_path)\n",
+    "        owner,repo = owner or self.cfg.user, repo or self.cfg.lib_name\n",
+    "        token = ifnone(token, os.getenv('NBDEV_TOKEN',None))\n",
+    "        if not token and Path('token').exists(): token = Path('token').read_text().strip()\n",
+    "        if not token: raise Exception('Failed to find token')\n",
+    "        self.gh = GhApi(owner, repo, token)\n",
+    "        self.groups = groups\n",
+    "\n",
+    "    def _issues(self, label):\n",
+    "        return self.gh.issues.list_for_repo(state='closed', sort='created', filter='all', since=self.commit_date, labels=label)\n",
+    "    def _issue_groups(self): return parallel(self._issues, self.groups.keys(), progress=False)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "To create a markdown changelog, first create a `Release` object, optionally passing a mapping from GitHub labels to markdown titles.\n",
+    "Put your GitHub token in a file named `token` at the root of your repo. `Release` attempts to fetch values for arguments from the following locations if not supplied:\n",
+    "\n",
+    "- **owner:** fetched from the field `user` in `settings.ini`. This is the owner name of the repository on GitHub. For example, for the repo `fastai/fastcore` the owner would be `fastai`.\n",
+    "- **repo:** fetched from the field `lib_name` in `settings.ini`. This is the name of the repository on GitHub. For example, for the repo `fastai/fastcore` the repo would be `fastcore`.\n",
+    "- **token:** fetched from a file named `token` at the root of your repo. Creating a token is discussed in [the setup](https://fastrelease.fast.ai/#Set-up) section.\n",
+    "- **groups:** (optional) fetched from the field `label_groups` in `settings.ini`, which is a JSON string. This is a mapping from label names to titles in your release notes. If not specified, this defaults to:\n",
+    "\n",
+    "```python\n",
+    "{\"breaking\": \"Breaking Changes\", \"enhancement\":\"New Features\", \"bug\":\"Bugs Squashed\"}\n",
+    "```"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#|export\n",
+    "@patch\n",
+    "def changelog(self:Release,\n",
+    "              debug=False): # Just print the latest changes, instead of updating file\n",
+    "    \"Create the CHANGELOG.md file, or return the proposed text if `debug` is `True`\"\n",
+    "    if not self.changefile.exists(): self.changefile.write_text(\"# Release notes\\n\\n<!-- do not remove -->\\n\")\n",
+    "    marker = '<!-- do not remove -->\\n'\n",
+    "    try: self.commit_date = self.gh.repos.get_latest_release().published_at\n",
+    "    except HTTP404NotFoundError: self.commit_date = '2000-01-01T00:00:00Z'\n",
+    "    res = f\"\\n## {self.cfg.version}\\n\"\n",
+    "    issues = self._issue_groups()\n",
+    "    res += '\\n'.join(_issues_txt(*o) for o in zip(issues, self.groups.values()))\n",
+    "    if debug: return res\n",
+    "    res = self.changefile.read_text().replace(marker, marker+res+\"\\n\")\n",
+    "    shutil.copy(self.changefile, self.changefile.with_suffix(\".bak\"))\n",
+    "    self.changefile.write_text(res)\n",
+    "    run(f'git add {self.changefile}')"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| eval: false\n",
+    "rel = Release()\n",
+    "# print(rel.changelog(debug=True))"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#|export\n",
+    "@patch\n",
+    "def release(self:Release):\n",
+    "    \"Tag and create a release in GitHub for the current version\"\n",
+    "    ver = self.cfg.version\n",
+    "    notes = self.latest_notes()\n",
+    "    self.gh.create_release(ver, branch=self.cfg.branch, body=notes)\n",
+    "    return ver"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "This uses the version information from your `settings.ini`.\n",
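+    "\n",
+    "For example, you can drive the same steps from Python instead of the CLI. A minimal sketch (it assumes `settings.ini` and your `token` file are set up as described in the Setup section):\n",
+    "\n",
+    "```python\n",
+    "from nbdev.release import Release\n",
+    "\n",
+    "rel = Release()\n",
+    "print(rel.changelog(debug=True))  # preview the proposed CHANGELOG entry\n",
+    "ver = rel.release()  # tag and create the GitHub release\n",
+    "print(f\"Released {ver}\")\n",
+    "```"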
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#|export\n",
+    "@patch\n",
+    "def latest_notes(self:Release):\n",
+    "    \"Latest CHANGELOG entry\"\n",
+    "    if not self.changefile.exists(): return ''\n",
+    "    its = re.split(r'^## ', self.changefile.read_text(), flags=re.MULTILINE)\n",
+    "    if len(its)<2: return ''\n",
+    "    return '\\n'.join(its[1].splitlines()[1:]).strip()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "All relevant pull requests and issues are fetched from the GitHub API, and are categorized according to a user-supplied mapping from labels to markdown headings."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## CLI functions"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "@call_parse\n",
+    "def changelog(\n",
+    "    debug:store_true=False, # Print info to be added to CHANGELOG, instead of updating file\n",
+    "    repo:str=None, # repo to use instead of `lib_name` from `settings.ini`\n",
+    "):\n",
+    "    \"Create a CHANGELOG.md file from closed and labeled GitHub issues\"\n",
+    "    res = Release(repo=repo).changelog(debug=debug)\n",
+    "    if debug: print(res)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "@call_parse\n",
+    "def release_git(\n",
+    "    token:str=None # Optional GitHub token (otherwise `token` file is used)\n",
+    "):\n",
+    "    \"Tag and create a release in GitHub for the current version\"\n",
+    "    ver = Release(token=token).release()\n",
+    "    print(f\"Released {ver}\")"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "@call_parse\n",
+    "def release_gh(\n",
+    "    token:str=None # Optional GitHub token (otherwise `token` file is used)\n",
+    "):\n",
+    "    \"Calls `nbdev_changelog`, lets you edit the result, then pushes to git and calls `nbdev_release_git`\"\n",
+    "    cfg = _find_config()\n",
+    "    Release().changelog()\n",
+    "    subprocess.run([os.environ.get('EDITOR','nano'), cfg.config_path/'CHANGELOG.md'])\n",
+    "    if not input(\"Make release now? (y/n) \").lower().startswith('y'): sys.exit(1)\n",
+    "    run('git commit -am release')\n",
+    "    run('git push')\n",
+    "    ver = Release(token=token).release()\n",
+    "    print(f\"Released {ver}\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Publish Packages"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "from fastcore.all import *\n",
+    "from nbdev.read import *\n",
+    "from nbdev.cli import *\n",
+    "\n",
+    "import yaml,subprocess,glob,platform\n",
+    "from copy import deepcopy\n",
+    "from os import system\n",
+    "try: from packaging.version import parse\n",
+    "except ImportError: from pip._vendor.packaging.version import parse\n",
+    "\n",
+    "_PYPI_URL = 'https://pypi.org/pypi/'"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "def pypi_json(s):\n",
+    "    \"Dictionary decoded JSON for PYPI path `s`\"\n",
+    "    return urljson(f'{_PYPI_URL}{s}/json')"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "def latest_pypi(name):\n",
+    "    \"Latest version of `name` on pypi\"\n",
+    "    return max(parse(r) for r,o in pypi_json(name)['releases'].items()\n",
+    "               if not parse(r).is_prerelease and not nested_idx(o, 0, 'yanked'))"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "def pypi_details(name):\n",
+    "    \"Version, URL, and SHA256 for `name` from pypi\"\n",
+    "    ver = str(latest_pypi(name))\n",
+    "    pypi = pypi_json(f'{name}/{ver}')\n",
+    "    info = pypi['info']\n",
+    "    rel = [o for o in pypi['urls'] if o['packagetype']=='sdist'][0]\n",
+    "    return ver,rel['url'],rel['digests']['sha256']"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "import shlex\n",
+    "from subprocess import Popen, PIPE, CalledProcessError\n",
+    "\n",
+    "def _run(cmd):\n",
+    "    res = \"\"\n",
+    "    with Popen(shlex.split(cmd), stdout=PIPE, bufsize=1, text=True) as p:\n",
+    "        for line in p.stdout:\n",
+    "            print(line, end='')\n",
+    "            res += line\n",
+    "\n",
+    "    if p.returncode != 0: raise CalledProcessError(p.returncode, p.args)\n",
+    "    return res"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "def conda_output_path(name):\n",
+    "    \"Output path for conda build\"\n",
+    "    return run(f'conda build --output {name}').strip().replace('\\\\', '/')"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "def _write_yaml(path, name, d1, d2):\n",
+    "    path = Path(path)\n",
+    "    p = path/name\n",
+    "    p.mkdir(exist_ok=True, parents=True)\n",
+    "    yaml.SafeDumper.ignore_aliases = lambda *args : True\n",
+    "    with (p/'meta.yaml').open('w') as f:\n",
+    "        yaml.safe_dump(d1, f)\n",
+    "        yaml.safe_dump(d2, f)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "def _get_conda_meta():\n",
+    "    cfg = get_config()\n",
+    "    name,ver = cfg.lib_name,cfg.version\n",
+    "    url = cfg.doc_host or cfg.git_url\n",
+    "\n",
+    "    reqs = ['pip', 'python', 'packaging']\n",
+    "    if cfg.get('requirements'): reqs += cfg.requirements.split()\n",
+    "    if cfg.get('conda_requirements'): reqs += cfg.conda_requirements.split()\n",
+    "\n",
+    "    pypi = pypi_json(f'{name}/{ver}')\n",
+    "    rel = [o for o in pypi['urls'] if o['packagetype']=='sdist'][0]\n",
+    "\n",
+    "    # Work around conda build bug - 'package' and 'source' must be first\n",
+    "    d1 = {\n",
+    "        'package': {'name': name, 'version': ver},\n",
+    "        'source': {'url':rel['url'], 'sha256':rel['digests']['sha256']}\n",
+    "    }\n",
+    "\n",
+    "    _dir = cfg.path('lib_path').parent\n",
+    "    readme = _dir/'README.md'\n",
+    "    descr = readme.read_text() if readme.exists() else ''\n",
+    "    d2 = {\n",
+    "        'build': {'number': '0', 'noarch': 'python',\n",
+    "                  'script': '{{ PYTHON }} -m pip install . -vv'},\n",
+    "        'requirements': {'host':reqs, 'run':reqs},\n",
+    "        'test': {'imports': [cfg.get('lib_path')]},\n",
+    "        'about': {\n",
+    "            'license': 'Apache Software',\n",
+    "            'license_family': 'APACHE',\n",
+    "            'home': url, 'doc_url': url, 'dev_url': url,\n",
+    "            'summary': cfg.get('description'),\n",
+    "            'description': descr\n",
+    "        },\n",
+    "        'extra': {'recipe-maintainers': [cfg.get('user')]}\n",
+    "    }\n",
+    "    return name,d1,d2"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "def write_conda_meta(path='conda'):\n",
+    "    \"Writes a `meta.yaml` file to the `conda` directory of the current directory\"\n",
+    "    _write_yaml(path, *_get_conda_meta())"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "This function is used in the `nbdev_conda` CLI command.\n",
+    "\n",
+    "**NB**: you need to upload your package to PyPI before creating the conda package."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "def anaconda_upload(name, loc=None, user=None, token=None, env_token=None):\n",
+    "    \"Upload `name` to anaconda\"\n",
+    "    user = f'-u {user} ' if user else ''\n",
+    "    if env_token: token = os.getenv(env_token)\n",
+    "    token = f'-t {token} ' if token else ''\n",
+    "    if not loc: loc = conda_output_path(name)\n",
+    "    if not loc: raise Exception(\"Failed to find output\")\n",
+    "    return _run(f'anaconda {token} upload {user} {loc} --skip-existing')"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "@call_parse\n",
+    "def release_conda(\n",
+    "    path:str='conda', # Path where package will be created\n",
+    "    do_build:bool_arg=True, # Run `conda build` step\n",
+    "    build_args:str='', # Additional args (as str) to send to `conda build`\n",
+    "    skip_upload:store_true=False, # Skip `anaconda upload` step\n",
+    "    mambabuild:store_true=False, # Use `mambabuild` (requires `boa`)\n",
+    "    upload_user:str=None # Optional user to upload package to\n",
+    "):\n",
+    "    \"Create a `meta.yaml` file ready to be built into a package, and optionally build and upload it\"\n",
+    "    name = config_key('lib_name', path=False)\n",
+    "    write_conda_meta(path)\n",
+    "    out = f\"Done. Next steps:\\n```\\ncd {path}\\n\"\"\"\n",
+    "    os.chdir(path)\n",
+    "    loc = conda_output_path(name)\n",
+    "    out_upl = f\"anaconda upload {loc}\"\n",
+    "    build = 'mambabuild' if mambabuild else 'build'\n",
+    "    if not do_build: return print(f\"{out}conda {build} .\\n{out_upl}\\n```\")\n",
+    "\n",
+    "    print(f\"conda {build} --no-anaconda-upload {build_args} {name}\")\n",
+    "    res = _run(f\"conda {build} --no-anaconda-upload {build_args} {name}\")\n",
+    "    if skip_upload: return\n",
+    "    if not upload_user: upload_user = config_key('conda_user', path=False)\n",
+    "    if not upload_user: return print(\"`conda_user` not in settings.ini and no `upload_user` passed. Cannot upload\")\n",
+    "    if 'anaconda upload' not in res: return print(f\"{res}\\n\\nFailed. Check auto-upload not set in .condarc. Try `--do_build False`.\")\n",
+    "    return anaconda_upload(name, loc, user=upload_user)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| export\n",
+    "def chk_conda_rel(\n",
+    "    nm:str, # Package name on pypi\n",
+    "    apkg:str=None, # Anaconda Package (defaults to {nm})\n",
+    "    channel:str='fastai', # Anaconda Channel\n",
+    "    force:store_true=False # Always return github tag\n",
+    "):\n",
+    "    \"Prints GitHub tag only if a newer release exists on Pypi compared to an Anaconda Repo.\"\n",
+    "    if not apkg: apkg=nm\n",
+    "    condavs = L(loads(run(f'mamba repoquery search {apkg} -c {channel} --json'))['result']['pkgs'])\n",
+    "    condatag = condavs.attrgot('version').map(parse)\n",
+    "    pypitag = latest_pypi(nm)\n",
+    "    if force or not condatag or pypitag > max(condatag): return f'{pypitag}'"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "To build and upload a conda package, cd to the root of your repo, and then:\n",
+    "\n",
+    "    nbdev_conda\n",
+    "\n",
+    "Or to do things more manually:\n",
+    "\n",
+    "```\n",
+    "nbdev_conda --do_build false\n",
+    "cd conda\n",
+    "conda build --no-anaconda-upload --output-folder build {name}\n",
+    "anaconda upload build/noarch/{name}-{ver}-*.tar.bz2\n",
+    "```\n",
+    "\n",
+    "Add `--debug` to the `conda build` command to debug any problems that occur. Note that the build step takes a few minutes. Add `-u {org_name}` to the `anaconda upload` command if you wish to upload to an organization, or pass `upload_user` to `nbdev_conda`.\n",
+    "\n",
+    "**NB**: you need to upload your package to PyPI before creating the conda package.\n",
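+    "\n",
+    "For reference, the conda recipe is generated from fields in `settings.ini`. A minimal sketch of some of the fields it reads (the values below are only illustrative):\n",
+    "\n",
+    "```\n",
+    "lib_name = fastcore\n",
+    "version = 1.5.0\n",
+    "user = fastai\n",
+    "conda_user = fastai\n",
+    "requirements = fastprogress pandas\n",
+    "conda_requirements = pandas\n",
+    "```"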
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#|export\n",
+    "@call_parse\n",
+    "def release_pypi(\n",
+    "    repository:str=\"pypi\" # Repository to upload to (defined in ~/.pypirc)\n",
+    "):\n",
+    "    \"Create and upload Python package to PyPI\"\n",
+    "    _dir = config_key(\"lib_path\").parent\n",
+    "    system(f'cd {_dir} && rm -rf dist && python setup.py sdist bdist_wheel')\n",
+    "    system(f'twine upload --repository {repository} {_dir}/dist/*')"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#|export\n",
+    "@call_parse\n",
+    "def release_both(\n",
+    "    path:str='conda', # Path where package will be created\n",
+    "    do_build:bool_arg=True, # Run `conda build` step\n",
+    "    build_args:str='', # Additional args (as str) to send to `conda build`\n",
+    "    skip_upload:store_true=False, # Skip `anaconda upload` step\n",
+    "    mambabuild:store_true=False, # Use `mambabuild` (requires `boa`)\n",
+    "    upload_user:str=None, # Optional user to upload package to\n",
+    "    repository:str=\"pypi\" # PyPI repository to upload to (defined in ~/.pypirc)\n",
+    "):\n",
+    "    \"Release both conda and PyPI packages\"\n",
+    "    release_pypi.__wrapped__(repository)\n",
+    "    release_conda.__wrapped__(path, do_build=do_build, build_args=build_args, skip_upload=skip_upload, mambabuild=mambabuild, upload_user=upload_user)\n",
+    "    nbdev_bump_version.__wrapped__()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Export -"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#| hide\n",
+    "import nbdev; nbdev.nbdev_export()"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": []
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 4
+}