Commit a27ab8f

fixing github workflows, updating .gitignore and pre-commit config
1 parent: 830a51e

14 files changed: +4,208 -23 lines

.github/workflows/pylint.yaml (+3 -2)

@@ -17,7 +17,8 @@ jobs:
       - name: Install dependencies
         run: |
           python -m pip install --upgrade pip
-          pip install pylint
+          if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
       - name: Analysing the code with pylint
         run: |
-          pylint $(git ls-files '*.py') --rcfile=.pylintrc
+          pip install pylint
+          pylint $(git ls-files '*.py') --rcfile=.pylintrc

.github/workflows/test_on_push.yaml → .github/workflows/test.yaml (renamed, +3 -7)

@@ -10,19 +10,15 @@ jobs:
         python-version: ["3.10"]

     steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
       - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v4
+        uses: actions/setup-python@v3
         with:
           python-version: ${{ matrix.python-version }}
       - name: Install dependencies
         run: |
           python -m pip install --upgrade pip
           if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
-      - name: Python Black
-        run: |
-          pip install black==24.10.0
-          black . --check
       - name: Test with pytest
         run: |
-          pytest --cov-report html:./results/cov_html --cov=src tests/
+          pytest --cov-report html:./results/cov_html tests/

.gitignore (+1 -1)

@@ -2,7 +2,7 @@ logs/
 ipynb_checkpoints/
 mlruns
 mlartifacts
-*.csv
+# *.csv

 # Byte-compiled / optimized / DLL files
 __pycache__/

.pre-commit-config.yaml (-1)

@@ -6,7 +6,6 @@ repos:
       - id: check-yaml
       - id: end-of-file-fixer
       - id: trailing-whitespace
-      - id: check-added-large-files
       - id: debug-statements
         language_version: python3


README.md (+1 -1)

@@ -61,4 +61,4 @@ dvc pull
 Reproduces the pipeline using DVC
 ```
 dvc repro
-```
+```

config/metadata.yaml (+1 -1)

@@ -1,4 +1,4 @@
 author: Rafael Greca Vieira
 model_type: scikit-learn_0.23
 project_name: e2e-mlops-project
-project_version: v0.1
+project_version: v0.1

config/settings.yaml (+5 -5)

@@ -7,11 +7,11 @@ EXPERIMENT_ID: '708342276728582022'
 VERSION: '1.0'

 # GENERAL SETTINGS
-DATA_PATH: '/media/greca/HD/GitHub/e2e-mlops-project/data/'
+DATA_PATH: './data/'
 RAW_FILE_NAME: 'Original_ObesityDataSet.csv'
-ARTIFACTS_PATH: '/media/greca/HD/GitHub/e2e-mlops-project/models/artifacts/'
-FEATURES_PATH: '/media/greca/HD/GitHub/e2e-mlops-project/models/features/'
-RESEARCH_ENVIRONMENT_PATH: '/media/greca/HD/GitHub/e2e-mlops-project/notebooks/'
+ARTIFACTS_PATH: './models/artifacts/'
+FEATURES_PATH: './models/features/'
+RESEARCH_ENVIRONMENT_PATH: './notebooks/'
 TARGET_COLUMN: 'NObeyesdad'
 LOG_LEVEL: 'INFO'
-LOG_PATH: '/media/greca/HD/GitHub/e2e-mlops-project/'
+LOG_PATH: './'
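
Replacing the machine-specific absolute paths with repository-relative ones makes the config portable, but every path then needs to be resolved against the project root rather than whatever the current working directory happens to be. Below is a minimal sketch of how such a settings file could be loaded with PyYAML; the load_settings helper is illustrative only and is not taken from the repository.

# Illustrative sketch only: load config/settings.yaml and resolve the
# relative *_PATH entries against the repository root. PyYAML is assumed
# to be installed; the helper name is hypothetical.
from pathlib import Path

import yaml

REPO_ROOT = Path(__file__).resolve().parent  # assumes this script sits at the repo root


def load_settings(config_file: Path = REPO_ROOT / "config" / "settings.yaml") -> dict:
    """Read settings.yaml and turn relative *_PATH values into absolute paths."""
    with open(config_file, encoding="utf-8") as file:
        settings = yaml.safe_load(file)

    for key, value in settings.items():
        if key.endswith("_PATH"):
            settings[key] = str((REPO_ROOT / value).resolve())

    return settings


if __name__ == "__main__":
    settings = load_settings()
    print(settings["DATA_PATH"])  # e.g. /abs/path/to/repo/data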

data/Original_ObesityDataSet.csv (+2,112)

Large diff not rendered.

data/Preprocessed_ObesityDataSet.csv (+2,077)

Large diff not rendered.

models/README.md (+1 -1)

@@ -2,4 +2,4 @@

 The models and artifacts will not be stored locally but rather in an AWS S3 Bucket to simulate a real-world scenario where models will have different versions (model versioning).

-This folder will be used temporarily to save the models and artifacts locally and then transfer it to your AWS S3 bucket. After that, the files will be deleted. If you choose to not use an AWS S3 Bucket and an AWS RDS Databaset, then the `artifacts` and the `features` will be stored into the `models` folder.
+This folder will be used temporarily to save the models and artifacts locally and then transfer it to your AWS S3 bucket. After that, the files will be deleted. If you choose to not use an AWS S3 Bucket and an AWS RDS Databaset, then the `artifacts` and the `features` will be stored into the `models` folder.
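
The README above describes a temporary hand-off: artifacts are saved locally, pushed to an S3 bucket, and the local copies are then deleted. A rough sketch of that flow with boto3 follows; the bucket name and key prefix are placeholders, not values defined by this project.

# Illustrative sketch only: bucket name and key prefix are placeholders.
# Upload every file under models/artifacts/ to S3, then delete the local
# copy, mirroring the temporary-folder workflow described in the README.
from pathlib import Path

import boto3

s3 = boto3.client("s3")
BUCKET = "my-mlops-artifacts"  # placeholder bucket name

artifacts_dir = Path("models/artifacts")
for file_path in artifacts_dir.rglob("*"):
    if file_path.is_file():
        key = f"artifacts/{file_path.relative_to(artifacts_dir).as_posix()}"
        s3.upload_file(str(file_path), BUCKET, key)
        file_path.unlink()  # remove the local copy once it is in S3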

requirements.txt (+3 -1)

@@ -1,6 +1,8 @@
-boto3==1.24.28
+awscli==1.34.0
+boto3==1.35.0
 fastapi==0.115.5
 joblib==1.3.2
+kaggle==1.6.17
 loguru==0.7.2
 mlflow==2.15.1
 numpy==1.21.5
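
requirements.txt now pins awscli, a newer boto3, and the kaggle client, which suggests the obesity dataset added under data/ can be (re)downloaded through the Kaggle API. A rough sketch is shown below, assuming credentials in ~/.kaggle/kaggle.json; the dataset slug is a placeholder, not one confirmed by this commit.

# Rough sketch; the dataset slug is a placeholder and the Kaggle API
# credentials are expected in ~/.kaggle/kaggle.json.
from kaggle.api.kaggle_api_extended import KaggleApi

api = KaggleApi()
api.authenticate()

# Download and unzip the dataset files into ./data/
api.dataset_download_files(
    "owner/obesity-levels-dataset",  # placeholder slug
    path="./data/",
    unzip=True,
)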

results/README.md (+1 -1)

@@ -1,3 +1,3 @@
 # Results

-Here goes the results of evaluates, like metrics, graphs or others.
+Here goes the results of evaluates, like metrics, graphs or others.

tests/integration/test_model_inference.py (-1)

@@ -47,7 +47,6 @@ def test_model_inference_pipeline() -> None:
     assert predictions.shape[0] == features.shape[0]
     assert isinstance(predictions.dtype, type(np.dtype("float64")))

-    # FIXME: fix this
     # predictions = loaded_model.predict(x, transform_to_str=True)

     # assert isinstance(predictions, List)

tests/unit/test_model_functions.py (-1)

@@ -89,7 +89,6 @@ def test_prediction() -> None:
     # Unit case to test the model performance on training and validation sets
     # (making sure that are the same values as mentioned in MLflow's UI).
     # """
-    # # FIXME: fix this
     # indexes = [FEATURES_NAME.index(i) for i in model_settings.FEATURES]

     # loaded_model = ModelServe(
