
Add function to loop over all ports to identify deleted ports #128

Closed

Conversation

arjunsalyan
Member

This is another part of #122.

We need to run this function at least once so that the ports that were previously missed get correctly marked as deleted. However, I am not sure of the right way to do this.

Any suggestions, @mojca @umeshksingla?

@codecov

codecov bot commented Oct 10, 2019

Codecov Report

Merging #128 into master will decrease coverage by 0.52%.
The diff coverage is 9.09%.


@@            Coverage Diff             @@
##           master     #128      +/-   ##
==========================================
- Coverage   73.84%   73.31%   -0.53%     
==========================================
  Files          46       46              
  Lines        1338     1349      +11     
==========================================
+ Hits          988      989       +1     
- Misses        350      360      +10
Impacted Files             Coverage          Δ
app/ports/models/port.py   85.03% <9.09%>    (-3.44%) ⬇️

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update bafaefc...51e82d4. Read the comment docs.

Member

@mojca left a comment


Looks ok to me. I guess that another step is needed to actually trigger this, right?

I left some notes, and tests cannot hurt, but I also don't mind if it goes in without modifications, just to get the database fixed.

@classmethod
def full_deleted_ports_run(cls):
all_ports_from_json = set()
data = Port.PortIndexUpdateHandler().sync_and_open_file()
Member


I need to check again, but does running this function always fetch the latest version of the json file?

I kind of feel that the functionality should be separate from rsync-ing the file.

Member Author


Yes, the function sync_and_open_file first fetches the latest file and then opens it for reading.


# Loop over all the ports in the database and check if it is available in all_ports_from_json
with transaction.atomic():
for port in Port.objects.all().only('name', 'active'):
Member


I would additionally filter out non-active ports. Once you end up with 80% of the ports being deleted, this will make a significant difference.

Member Author


Right, thanks. Very fair point.
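[Editor's note] The logic under review, with the filtering suggested above, can be sketched with the Django ORM factored out. This is a hypothetical, simplified version (the function name and arguments are illustrative, not from the PR): compare the names of ports currently marked active in the database against the set of names parsed from the freshly synced JSON index; names absent from the index are the ports that should be marked deleted.

```python
def find_deleted_ports(active_port_names, all_ports_from_json):
    """Return names of active ports that are missing from the JSON index.

    active_port_names: iterable of names of ports with active=True
        (mirroring a filtered Port.objects queryset, as suggested above).
    all_ports_from_json: set of port names parsed from the synced file
        (mirroring the all_ports_from_json set in the reviewed snippet).
    """
    # Set difference gives the ports present in the database but gone
    # from the index; sort for a deterministic result.
    return sorted(set(active_port_names) - set(all_ports_from_json))
```

Filtering to active ports first keeps the comparison small: once most ports have already been marked deleted, only the active remainder needs to be checked against the index.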

@arjunsalyan
Member Author

> Looks ok to me. I guess that another step is needed to actually trigger this, right?

Yes, and I am not sure how we would trigger this. Maybe a migration?

@mojca
Member

mojca commented Oct 10, 2019

> > Looks ok to me. I guess that another step is needed to actually trigger this, right?
>
> Yes, and I am not sure how we would trigger this. Maybe a migration?

Hmmm ... actually, a migration sounds like a suitable idea (we do need to fix the database, after all). Maybe add a comment next to the function noting that we had to fix the situation once due to buggy code, but that we might want to run it again every now and then. We don't immediately need the perfect solution: having the function available and running it once might do for now, and then we can think about the best way to potentially do full updates.

I can imagine cases like changing the format of the JSON to create more user-friendly licence fields. In such cases we'll need to run through the whole database as well.

Bottom line: we can merge this before implementing the migration (or together with it, if ready), and worry about the best way to run it later.

@arjunsalyan
Member Author

arjunsalyan commented Oct 20, 2019

We can merge this together with the required migration file. I will add the migration file after we merge #119 just to avoid any conflicts in the migrations.

@arjunsalyan
Member Author

I am unable to call this model method from the migrations. It seems we are not allowed to do this unless I move all the code into the migration file itself.
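[Editor's note] This is expected Django behavior: data migrations work against historical model snapshots obtained via `apps.get_model()`, which carry the model's fields but not its custom methods, so a classmethod like `full_deleted_ports_run` is not reachable there. The usual pattern is a `RunPython` operation whose logic lives in the migration file, roughly along these lines (a hedged sketch only: the app label, dependency, and the `load_port_names_from_json_index` helper are hypothetical placeholders, and it assumes "deleted" is represented as `active=False`, as the reviewed snippet's `.only('name', 'active')` suggests):

```python
from django.db import migrations


def mark_missing_ports_deleted(apps, schema_editor):
    # Historical model snapshot: fields only, no custom classmethods.
    Port = apps.get_model('ports', 'Port')
    # Hypothetical helper standing in for however the real code obtains
    # the set of port names from the synced JSON index.
    all_ports_from_json = load_port_names_from_json_index()
    # Mark active ports that no longer appear in the index as deleted.
    Port.objects.filter(active=True).exclude(
        name__in=all_ports_from_json
    ).update(active=False)


class Migration(migrations.Migration):

    dependencies = [
        ('ports', '0002_previous_migration'),  # placeholder dependency
    ]

    operations = [
        migrations.RunPython(mark_missing_ports_deleted,
                             migrations.RunPython.noop),
    ]
```

Passing `migrations.RunPython.noop` as the reverse operation lets the migration be unapplied without trying to resurrect deleted ports.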

@arjunsalyan
Member Author

The new update mechanism has an improved version of this. This PR can be closed.

@arjunsalyan arjunsalyan closed this Aug 3, 2020