Cannot override default cache key with fragment caching #1804
In master I've removed the dynamic serializers. I'm curious if you can find a better solution there... in master:

```ruby
def fetch_attributes_fragment(adapter_instance)
  serializer_class._cache_options ||= {}
  serializer_class._cache_options[:key] = serializer_class._cache_key if serializer_class._cache_key
  # ...
  key = cache_key(adapter_instance)
```

so:

```ruby
def cache_key(adapter_instance)
  return @cache_key if defined?(@cache_key)
  parts = []
  parts << object_cache_key
  parts << adapter_instance.cache_key
  parts << serializer_class._cache_digest unless serializer_class._skip_digest?
  @cache_key = parts.join('/')
end
```
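For illustration, the key assembly in `cache_key` above can be sketched in plain Ruby. The method and argument names here are illustrative stand-ins, not AMS API; AMS derives the actual values at render time.

```ruby
# Plain-Ruby sketch of how the fragment cache key is assembled:
# join the object's cache key, the adapter's cache key, and
# (unless digests are skipped) the serializer file digest with '/'.
def build_cache_key(object_cache_key, adapter_key, digest: nil, skip_digest: false)
  parts = []
  parts << object_cache_key
  parts << adapter_key
  parts << digest unless skip_digest || digest.nil?
  parts.join('/')
end

build_cache_key('users/1-20160701', 'attributes', digest: 'abc123')
# => "users/1-20160701/attributes/abc123"
build_cache_key('users/1-20160701', 'attributes', digest: 'abc123', skip_digest: true)
# => "users/1-20160701/attributes"
```

This also shows why skipping the digest (as discussed below) saves a little computation: the key simply omits its last segment.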
Also, any particular reason you're skipping the digest? |
@bf4 No strong reason other than I purge all cache on deploy, so computing digests is extra computation that isn't required. |
Ok, so it looks like @bf4 is off the project? All comments, mentions, and PRs are gone
@onomated github thinks I'm a robot
@bf4 You probably are a robot with all the activity you're balancing at a time. Took a look at master and my Thanks for your hard work! |
Turns out I made too many links in an issue on my blog repo (was just using it to link dump) |
@bf4 May be related to your recent changes to caching, but I started getting this error during cache fetch. See stack trace below. AMS commit hash is up-to-date with master. There have been no changes to application code since:
|
@bf4 Error above is a result of fragment caching not being invoked. AMS is attempting to cache an excluded field in my entity. I've inspected my redis cache and confirmed that excluded fields are being cached. Stepping through |
Do you think you can narrow this down to a failing test? I can pair with you if it helps.
|
@onomated any debug info you can provide here is helpful. I was going to release 0.10.1 but am hesitant if there's a bug |
@bf4, sorry, just seeing this. I can take a look at writing specs for this later today. Cursory glance at the |
I just released 0.10.2, which fixes the fragmented-cache-with-read_multi issue: https://twitter.com/hazula/status/750394854115450880
@onomated What's needed to consider this issue resolved? |
@bf4 Swamped with looming deadlines, but what I did in the meantime was to disable caching altogether for those models. I can re-enable and verify, but it is a pretty straightforward verification. I rendered JSON for my models and inspected the fields saved in cache, expecting my |
would love to know what this entails. |
@bf4 Still seeing excluded fields in cache with |
I noticed my |
@rromanchuk probably due to #1642. The |
@groyoh thanks for the heads up, I haven't dug into the cache mixin enough to actually know what is going on. I had a feeling I should maybe try to override cache_key as a sanity test. If I'm only trying to rename the key, what's the best practice? Would something like |
@rromanchuk I'm not completely sure what you're trying to achieve, so I guess that depends on whether you use multiple adapters or not. If you don't, then I guess it's fine. Otherwise I would strongly recommend that you override `object_cache_key`:

```ruby
class BaseSerializer < ActiveModel::Serializer
  def object_cache_key
    "#{serializer_class._cache_key}_#{object.cache_key}"
  end
end

class MyCachedSerializer < BaseSerializer
  cache key: 'custom_key'
end
```

That would result in a key prefixed with `custom_key_`. Disclaimer: I'm just saying that off the top of my head; I did not take the time to make sure it actually works properly. Also, if you have further questions, it may be better to open a new issue with a proper description of your troubles.
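As a plain-Ruby sketch of what that override produces (no AMS involved; the method and argument names are illustrative):

```ruby
# Simulates the object_cache_key override above: prefix the model's
# cache key with the serializer's configured key so two serializers
# for the same model get distinct cache entries.
def prefixed_cache_key(serializer_key, model_cache_key)
  "#{serializer_key}_#{model_cache_key}"
end

prefixed_cache_key('custom_key', 'users/1-20160701')
# => "custom_key_users/1-20160701"
```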
@groyoh this looks exactly like what i need, i'll let you know how it goes |
|
If we are explicitly overriding the cache key in the serializer, I would think that would take precedence. In Rails, every model implements `cache_key`. Also, the current implementation requires the model's

If the current implementation is really preferred, I would suggest that there is at least some warning when the serializer cache key is ignored. We were just burned by this in production because we added one existing field to the JSON and thought an update to the serializer cache key would fix the issue.

Having one serializer for User and one serializer for UserWithAuthToken is a pretty common use case for multiple serializers for one model. I would not be surprised if there are applications out there that are unknowingly caching UserWithAuthToken and then serving that JSON to other requests that use the normal UserSerializer. You can imagine how bad this would be, but the developer would never know about it unless they inspected the JSON responses of their own application! Seems dangerous.
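A minimal sketch of that collision, using an in-memory Hash in place of Rails.cache (serializer names and payloads here are hypothetical):

```ruby
# Both serializers derive the cache key from the model alone,
# so they share one cache entry; whichever renders first wins.
cache = {}
model_key = 'users/1-20160701'

with_token = { id: 1, name: 'Ada', auth_token: 'secret' } # UserWithAuthTokenSerializer output
plain      = { id: 1, name: 'Ada' }                       # UserSerializer output

cache[model_key] ||= with_token          # token-bearing payload is cached first
served = (cache[model_key] ||= plain)    # UserSerializer hits the very same key
served.key?(:auth_token)                 # => true: the token leaks to the plain endpoint
```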
I was indeed leaking my authenticated user serializer. 🙈 |
@lserman sorry to hear that you had trouble with that. I'll try to dig in and see if there is a deeper reason for that specific change. The issue right now is that changing this may imply a breaking change, so I'll try to see what we can do.
@onomated |
@TONYHOKAN I have since disabled Fragment caching. In my case it actually was less performant than without caching, as noted here. For some of my endpoints, I cache the full response (including headers to retain pagination info) directly in rails cache, for data that doesn't change frequently. |
@onomated Thanks for your reply. It seems that I can only disable fragment caching for now and keep tracking this problem.
Expected vs Actual Behavior

I have a model, say `User`, that I would like to present in two representations, say a `UserSerializer` and a `ProfileSerializer`. My serializers are defined as follows: The generated cache keys do not follow the overridden `object_cache_key` and fall back to the AMS default, which is based on the model name. So `ProfileSerializer` responses overwrite responses of the `UserSerializer` in cache, since they both act on the same model.

Is there a way to provide a cache prefix or overwrite the default cache key generated in fragment caching? I have tried `cache key: 'some_prefix', skip_digest: true, except: [:num_following]` and that also doesn't work.

This issue only occurs with fragment caching, due to the dynamic creation of Cached vs NonCached variants of the serializer. It works fine without fragment caching.

Steps to reproduce

See examples above

Environment

Rails 4.2.2, Grape 0.16