
Commit

merge dev into epss score
* Update versions in application files

* Update versions

* Parse GitHub vulnerability version (DefectDojo#9462)

* Fix SARIF parser with CodeQL rules (DefectDojo#9440)

* fix for sarif parser with codeql rules

* add check for extensions property

* flake8 comparison

* finding sla expiration date field (part two) (DefectDojo#9494)

* finding sla expiration date field (part two)

* sla violation check updates

* clean up of finding violates_sla property

* flake8 fix

* Update dojo/models.py

Co-authored-by: Charles Neill <1749665+cneill@users.noreply.github.com>

* Update 0201_populate_finding_sla_expiration_date.py

---------

Co-authored-by: Charles Neill <1749665+cneill@users.noreply.github.com>

* Jira Server/DataCenter: Update meta methods (DefectDojo#9512)

* Jira Webhook: Catch comments from other issue updates (DefectDojo#9513)

* Jira Webhook: Catch comments from other issue updates

* Accommodate redirect responses

* Update dojo/jira_link/views.py

Co-authored-by: Charles Neill <1749665+cneill@users.noreply.github.com>

* Fix syntax

---------

Co-authored-by: Charles Neill <1749665+cneill@users.noreply.github.com>

* add metrics page: "Product Tag Count" (fixes DefectDojo#9151) (DefectDojo#9152)

* add metrics page: "Product Tag Count"

It is fully based on the "Product Type Count" metrics page.

* fixup! add metrics page: "Product Tag Count"

* Fix Flake8

* Update views.py

---------

Co-authored-by: Cody Maffucci <46459665+Maffooch@users.noreply.github.com>

* Release Drafter: Try validating inputs

* Disallow duplicate tool types (DefectDojo#9530)

* Disallow duplicate tool types

* Fix Flake8

* Only validate on new creations

* Force new name on tool type unit test

* Engagement Surveys: Add missing leading slash (DefectDojo#9531)

URL redirects were behaving strangely without this leading slash. It seems it was missed when all the others were added.

* Update versions in application files

* Update versions in application files

* Dojo_Group: Support for "RemoteUser" in model (DefectDojo#9405)

* Use correct name references

* fix db_mig

* Update and rename 0201_alter_dojo_group_social_provider.py to 0202_alter_dojo_group_social_provider.py

---------

Co-authored-by: Cody Maffucci <46459665+Maffooch@users.noreply.github.com>

* Update rabbitmq:3.12.12-alpine Docker digest from 3.12.12 to 3.12.12-alpine (docker-compose.yml) (DefectDojo#9535)

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>

* remove flot-axis library (DefectDojo#9540)

* use full url for helm-repos and alias in renovate.json (DefectDojo#9525)

With this change, renovate will create PRs to update the Helm dependencies, just as it already does for docker-compose.

Note that only setting the repository to the full URL did not work; I also had to add the registryAlias.
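
For context, a minimal sketch of the kind of Chart.yaml dependency entries this lets renovate resolve (this excerpt is not part of the diff shown below; the chart names and versions come from the renovate PR titles in this commit, while the condition fields are assumptions):

    # Hypothetical excerpt of helm/defectdojo/Chart.yaml (not shown in this truncated diff).
    # The repository URL matches the "bitnami" registryAlias added to .github/renovate.json.
    dependencies:
      - name: redis
        version: ~16.13.0
        repository: https://charts.bitnami.com/bitnami   # full URL, resolvable by renovate
        condition: redis.enabled                          # assumed enable/disable toggle
      - name: mysql
        version: ~9.19.0
        repository: https://charts.bitnami.com/bitnami
        condition: mysql.enabled                          # assumed enable/disable toggle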

* Update Helm release redis from 16.12.3 to ~16.13.0 (helm/defectdojo/Chart.yaml) (DefectDojo#9550)

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>

* Update rabbitmq:3.12.12-alpine Docker digest from 3.12.12 to 3.12.12-alpine (docker-compose.yml) (DefectDojo#9541)

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>

* Update postgres Docker tag from 16.1 to v16.2 (docker-compose.yml) (DefectDojo#9536)

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>

* Update Helm release mysql from 9.1.8 to ~9.19.0 (helm/defectdojo/Chart.yaml) (DefectDojo#9545)

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>

---------

Co-authored-by: DefectDojo release bot <dojo-release-bot@users.noreply.github.com>
Co-authored-by: Cody Maffucci <46459665+Maffooch@users.noreply.github.com>
Co-authored-by: Colm O hEigeartaigh <coheigea@users.noreply.github.com>
Co-authored-by: Andrei Serebriakov <ansereb@toloka.ai>
Co-authored-by: Blake Owens <76979297+blakeaowens@users.noreply.github.com>
Co-authored-by: Charles Neill <1749665+cneill@users.noreply.github.com>
Co-authored-by: tomaszn <tomaszn@users.noreply.github.com>
Co-authored-by: kiblik <tomas@kubla.sk>
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Felix Hernandez <ffhg_920522@hotmail.com>
Co-authored-by: Sebastian Gumprich <rndmh3ro@users.noreply.github.com>
12 people authored Feb 15, 2024
1 parent 8d71ee4 commit 187309a
Showing 40 changed files with 693 additions and 178 deletions.
5 changes: 4 additions & 1 deletion .github/renovate.json
@@ -12,5 +12,8 @@
"commitMessageExtra": "from {{currentVersion}} to {{#if isMajor}}v{{{newMajor}}}{{else}}{{#if isSingleVersion}}v{{{toVersion}}}{{else}}{{{newValue}}}{{/if}}{{/if}}",
"commitMessageSuffix": "({{packageFile}})",
"labels": ["dependencies"]
}]
}],
"registryAliases": {
"bitnami": "https://charts.bitnami.com/bitnami"
}
}
13 changes: 8 additions & 5 deletions .github/workflows/fetch-oas.yml
@@ -10,6 +10,9 @@ on:
This will override any version calculated by the release-drafter.
required: true

env:
release_version: ${{ github.event.inputs.version || github.event.inputs.release_number }}

jobs:
oas_fetch:
name: Fetch OpenAPI Specifications
@@ -21,19 +24,19 @@
- name: Checkout
uses: actions/checkout@v4
with:
ref: ${{ github.event.inputs.version }}
ref: release/${{ env.release_version }}

- name: Load docker images
run: |-
docker pull defectdojo/defectdojo-django:${{ github.event.inputs.version }}-alpine
docker pull defectdojo/defectdojo-nginx:${{ github.event.inputs.version }}-alpine
docker pull defectdojo/defectdojo-django:${{ env.release_version }}-alpine
docker pull defectdojo/defectdojo-nginx:${{ env.release_version }}-alpine
docker images
- name: Start Dojo
run: docker-compose --profile postgres-redis --env-file ./docker/environments/postgres-redis.env up --no-deps -d postgres nginx uwsgi
env:
DJANGO_VERSION: ${{ github.event.inputs.version }}-alpine
NGINX_VERSION: ${{ github.event.inputs.version }}-alpine
DJANGO_VERSION: ${{ env.release_version }}-alpine
NGINX_VERSION: ${{ env.release_version }}-alpine

- name: Download OpenAPI Specifications
run: |-
1 change: 0 additions & 1 deletion components/package.json
@@ -21,7 +21,6 @@
"drmonty-datatables-responsive": "^1.0.0",
"easymde": "^2.18.0",
"flot": "flot/flot#~0.8.3",
"flot-axis": "markrcote/flot-axislabels#*",
"font-awesome": "^4.0.0",
"fullcalendar": "^3.10.2",
"google-code-prettify": "^1.0.0",
4 changes: 2 additions & 2 deletions docker-compose.yml
@@ -138,7 +138,7 @@ services:
volumes:
- defectdojo_data:/var/lib/mysql
postgres:
image: postgres:16.1-alpine@sha256:17eb369d9330fe7fbdb2f705418c18823d66322584c77c2b43cc0e1851d01de7
image: postgres:16.2-alpine@sha256:bbd7346fab25b7e0b25f214829d6ebfb78ef0465059492e46dee740ce8fcd844
profiles:
- postgres-rabbitmq
- postgres-redis
@@ -149,7 +149,7 @@
volumes:
- defectdojo_postgres:/var/lib/postgresql/data
rabbitmq:
image: rabbitmq:3.12.12-alpine@sha256:fcd6a66524be55c15c81011dc87cc4b6e4405130fbb950c21ad1d31e8f6322dd
image: rabbitmq:3.12.12-alpine@sha256:9144c0eca261e36ffd1a3f9ef21a860242a4a60e0211bbade82c80910958a5e9
profiles:
- mysql-rabbitmq
- postgres-rabbitmq
3 changes: 3 additions & 0 deletions docs/content/en/usage/features.md
@@ -557,6 +557,9 @@ Product Type Counts

![Product Type Counts](../../images/met_2.png)

Product Tag Counts
: Same as above, but for a group of products sharing a tag.

Simple Metrics
: Provides tabular data for all Product Types. The data displayed in
this view is the total number of S0, S1, S2, S3, S4, Opened This
8 changes: 8 additions & 0 deletions dojo/api_v2/serializers.py
@@ -1133,6 +1133,14 @@ class Meta:
model = Tool_Type
fields = "__all__"

def validate(self, data):
if self.context["request"].method == "POST":
name = data.get("name")
# Make sure this will not create a duplicate test type
if Tool_Type.objects.filter(name=name).count() > 0:
raise serializers.ValidationError('A Tool Type with the name already exists')
return data


class RegulationSerializer(serializers.ModelSerializer):
class Meta:
133 changes: 133 additions & 0 deletions dojo/db_migrations/0201_populate_finding_sla_expiration_date.py
@@ -0,0 +1,133 @@
from django.db import migrations
from django.utils import timezone
from datetime import datetime
from django.conf import settings
from dateutil.relativedelta import relativedelta
import logging

from dojo.utils import get_work_days

logger = logging.getLogger(__name__)


def calculate_sla_expiration_dates(apps, schema_editor):
System_Settings = apps.get_model('dojo', 'System_Settings')

ss, _ = System_Settings.objects.get_or_create()
if not ss.enable_finding_sla:
return

logger.info('Calculating SLA expiration dates for all findings')

SLA_Configuration = apps.get_model('dojo', 'SLA_Configuration')
Finding = apps.get_model('dojo', 'Finding')

findings = Finding.objects.filter(sla_expiration_date__isnull=True).order_by('id').only('id', 'sla_start_date', 'date', 'severity', 'test', 'mitigated')

page_size = 1000
total_count = Finding.objects.filter(id__gt=0).count()
logger.info('Found %d findings to be updated', total_count)

i = 0
batch = []
last_id = 0
total_pages = (total_count // page_size) + 2
for p in range(1, total_pages):
page = findings.filter(id__gt=last_id)[:page_size]
for find in page:
i += 1
last_id = find.id

start_date = find.sla_start_date if find.sla_start_date else find.date

sla_config = SLA_Configuration.objects.filter(id=find.test.engagement.product.sla_configuration_id).first()
sla_period = getattr(sla_config, find.severity.lower(), None)

days = None
if settings.SLA_BUSINESS_DAYS:
if find.mitigated:
days = get_work_days(find.date, find.mitigated.date())
else:
days = get_work_days(find.date, timezone.now().date())
else:
if isinstance(start_date, datetime):
start_date = start_date.date()

if find.mitigated:
days = (find.mitigated.date() - start_date).days
else:
days = (timezone.now().date() - start_date).days

days = days if days > 0 else 0

days_remaining = None
if sla_period:
days_remaining = sla_period - days

if days_remaining:
if find.mitigated:
find.sla_expiration_date = find.mitigated.date() + relativedelta(days=days_remaining)
else:
find.sla_expiration_date = timezone.now().date() + relativedelta(days=days_remaining)

batch.append(find)

if (i > 0 and i % page_size == 0):
Finding.objects.bulk_update(batch, ['sla_expiration_date'])
batch = []
logger.info('%s out of %s findings processed...', i, total_count)

Finding.objects.bulk_update(batch, ['sla_expiration_date'])
batch = []
logger.info('%s out of %s findings processed...', i, total_count)


def reset_sla_expiration_dates(apps, schema_editor):
System_Settings = apps.get_model('dojo', 'System_Settings')

ss, _ = System_Settings.objects.get_or_create()
if not ss.enable_finding_sla:
return

logger.info('Resetting SLA expiration dates for all findings')

Finding = apps.get_model('dojo', 'Finding')

findings = Finding.objects.filter(sla_expiration_date__isnull=False).order_by('id').only('id')

page_size = 1000
total_count = Finding.objects.filter(id__gt=0).count()
logger.info('Found %d findings to be reset', total_count)

i = 0
batch = []
last_id = 0
total_pages = (total_count // page_size) + 2
for p in range(1, total_pages):
page = findings.filter(id__gt=last_id)[:page_size]
for find in page:
i += 1
last_id = find.id

find.sla_expiration_date = None
batch.append(find)

if (i > 0 and i % page_size == 0):
Finding.objects.bulk_update(batch, ['sla_expiration_date'])
batch = []
logger.info('%s out of %s findings processed...', i, total_count)

Finding.objects.bulk_update(batch, ['sla_expiration_date'])
batch = []
logger.info('%s out of %s findings processed...', i, total_count)


class Migration(migrations.Migration):

dependencies = [
('dojo', '0200_finding_sla_expiration_date_product_async_updating_and_more'),
]

operations = [
migrations.RunPython(calculate_sla_expiration_dates, reset_sla_expiration_dates),
]
18 changes: 18 additions & 0 deletions dojo/db_migrations/0202_alter_dojo_group_social_provider.py
@@ -0,0 +1,18 @@
# Generated by Django 4.1.13 on 2024-01-25 00:07

from django.db import migrations, models


class Migration(migrations.Migration):

dependencies = [
('dojo', '0201_populate_finding_sla_expiration_date'),
]

operations = [
migrations.AlterField(
model_name='dojo_group',
name='social_provider',
field=models.CharField(blank=True, choices=[('AzureAD', 'AzureAD'), ('Remote', 'Remote')], help_text='Group imported from a social provider.', max_length=10, null=True, verbose_name='Social Authentication Provider'),
),
]
17 changes: 7 additions & 10 deletions dojo/filters.py
@@ -11,6 +11,7 @@
from django.conf import settings
import six
from django.utils.translation import gettext_lazy as _
from django.utils import timezone
from django_filters import FilterSet, CharFilter, OrderingFilter, \
ModelMultipleChoiceFilter, ModelChoiceFilter, MultipleChoiceFilter, \
BooleanFilter, NumberFilter, DateFilter
@@ -148,16 +149,12 @@ def any(self, qs, name):
return qs

def sla_satisfied(self, qs, name):
for finding in qs:
if finding.violates_sla:
qs = qs.exclude(id=finding.id)
return qs
# return findings that have an sla expiration date after today or no sla expiration date
return qs.filter(Q(sla_expiration_date__isnull=True) | Q(sla_expiration_date__gt=timezone.now().date()))

def sla_violated(self, qs, name):
for finding in qs:
if not finding.violates_sla:
qs = qs.exclude(id=finding.id)
return qs
# return active findings that have an sla expiration date before today
return qs.filter(Q(active=True) & Q(sla_expiration_date__lt=timezone.now().date()))

options = {
None: (_('Any'), any),
@@ -184,13 +181,13 @@ def any(self, qs, name):

def sla_satisifed(self, qs, name):
for product in qs:
if product.violates_sla:
if product.violates_sla():
qs = qs.exclude(id=product.id)
return qs

def sla_violated(self, qs, name):
for product in qs:
if not product.violates_sla:
if not product.violates_sla():
qs = qs.exclude(id=product.id)
return qs

37 changes: 35 additions & 2 deletions dojo/forms.py
@@ -2119,21 +2119,37 @@ def get_years():
return [(now.year, now.year), (now.year - 1, now.year - 1), (now.year - 2, now.year - 2)]


class ProductTypeCountsForm(forms.Form):
class ProductCountsFormBase(forms.Form):
month = forms.ChoiceField(choices=list(MONTHS.items()), required=True, error_messages={
'required': '*'})
year = forms.ChoiceField(choices=get_years, required=True, error_messages={
'required': '*'})


class ProductTypeCountsForm(ProductCountsFormBase):
product_type = forms.ModelChoiceField(required=True,
queryset=Product_Type.objects.none(),
error_messages={
'required': '*'})

def __init__(self, *args, **kwargs):
super(ProductTypeCountsForm, self).__init__(*args, **kwargs)
super().__init__(*args, **kwargs)
self.fields['product_type'].queryset = get_authorized_product_types(Permissions.Product_Type_View)


class ProductTagCountsForm(ProductCountsFormBase):
product_tag = forms.ModelChoiceField(required=True,
queryset=Product.tags.tag_model.objects.none().order_by('name'),
error_messages={
'required': '*'})

def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
prods = get_authorized_products(Permissions.Product_View)
tags_available_to_user = Product.tags.tag_model.objects.filter(product__in=prods)
self.fields['product_tag'].queryset = tags_available_to_user


class APIKeyForm(forms.ModelForm):
id = forms.IntegerField(required=True,
widget=forms.widgets.HiddenInput())
@@ -2388,6 +2404,23 @@ class Meta:
model = Tool_Type
exclude = ['product']

def __init__(self, *args, **kwargs):
instance = kwargs.get('instance', None)
self.newly_created = True
if instance is not None:
self.newly_created = instance.pk is None
super().__init__(*args, **kwargs)

def clean(self):
form_data = self.cleaned_data
if self.newly_created:
name = form_data.get("name")
# Make sure this will not create a duplicate test type
if Tool_Type.objects.filter(name=name).count() > 0:
raise forms.ValidationError('A Tool Type with the name already exists')

return form_data


class RegulationForm(forms.ModelForm):
class Meta:
12 changes: 6 additions & 6 deletions dojo/jira_link/helper.py
@@ -1036,28 +1036,28 @@ def get_issuetype_fields(

else:
try:
issuetypes = jira.createmeta_issuetypes(project_key)
issuetypes = jira.project_issue_types(project_key)
except JIRAError as e:
e.text = f"Jira API call 'createmeta/issuetypes' failed with status: {e.status_code} and message: {e.text}. Project misconfigured or no permissions in Jira ?"
raise e

issuetype_id = None
for it in issuetypes['values']:
if it['name'] == issuetype_name:
issuetype_id = it['id']
for it in issuetypes:
if it.name == issuetype_name:
issuetype_id = it.id
break

if not issuetype_id:
raise JIRAError("Issue type ID can not be matched. Misconfigured default issue type ?")

try:
issuetype_fields = jira.createmeta_fieldtypes(project_key, issuetype_id)
issuetype_fields = jira.project_issue_fields(project_key, issuetype_id)
except JIRAError as e:
e.text = f"Jira API call 'createmeta/fieldtypes' failed with status: {e.status_code} and message: {e.text}. Misconfigured project or default issue type ?"
raise e

try:
issuetype_fields = [f['fieldId'] for f in issuetype_fields['values']]
issuetype_fields = [f.fieldId for f in issuetype_fields]
except Exception:
raise JIRAError("Misconfigured default issue type ?")

