AsyncLocal Memory Usage in MemoryCache #45436
Tagging subscribers to this area: @eerhardt, @maryamariyan

Description

I recently analyzed a few memory dumps of a .NET 5.0 application which makes heavy use of MemoryCache. Two things stood out that I could not really explain:

MemoryCacheEntryOptions

For each new cache entry a new MemoryCacheEntryOptions object was created, but at least for the options used by the app it was not necessary to create a new object for each new cache entry, so it was removed and replaced by directly setting the appropriate values on the cache entry (or using the extension methods). Maybe it would be useful to enhance the docs to point to the extension methods. The examples in https://docs.microsoft.com/en-us/aspnet/core/performance/caching/memory?view=aspnetcore-5.0 create MemoryCacheEntryOptions most of the time instead of using the extension methods from https://github.com/dotnet/runtime/blob/a2f81e7258d2963af52d4441e891025aa46a1fc3/src/libraries/Microsoft.Extensions.Caching.Abstractions/src/MemoryCacheExtensions.cs. I guess that's the reason why a new options object was created each time.

Memory Usage of AsyncLocal in CacheEntry

This is the more interesting one: the memory usage of System.Threading.AsyncLocalValueMap+ThreeElementAsyncLocalValueMap for the ExecutionContext was actually the third largest memory allocation behind string and byte[] (this is from the real dump; it is not just one request, but a batch of them, probably around 400 or 500). See screenshot.

I wrote a short repro which shows the problem:
using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

namespace CacheEntryRepro
{
    class Program
    {
        static async Task Main(string[] args)
        {
            var builder = Host.CreateDefaultBuilder().ConfigureWebHostDefaults(whb =>
                whb.ConfigureServices(s => s.AddMemoryCache())
                   .Configure(app =>
                   {
                       app.UseRouting();
                       app.UseEndpoints(erb => { erb.MapGet("/", hc => RequestDelegate(hc)); });
                   }));
            await builder.Build().RunAsync();
        }

        private static async Task RequestDelegate(HttpContext context)
        {
            var cache = context.RequestServices.GetRequiredService<IMemoryCache>();
            int processors = Environment.ProcessorCount;
            var tasks = new Task[processors];
            for (int i = 0; i < 1000; i++)
            {
                for (int j = 0; j < processors; j++)
                {
                    tasks[j] = AddCacheEntry(cache);
                }
                await Task.WhenAll(tasks);
            }
            context.Response.StatusCode = StatusCodes.Status200OK;
            await context.Response.StartAsync();
            await context.Response.WriteAsync("Done");
            await context.Response.CompleteAsync();
        }

        private static Task AddCacheEntry(IMemoryCache cache)
        {
            var guid = Guid.NewGuid();
            using var entry = cache.CreateEntry(guid.ToString());
            entry.Size = 1;
            entry.SlidingExpiration = TimeSpan.FromMinutes(1);
            entry.Value = Guid.NewGuid();
            return Task.CompletedTask;
        }
    }
}

This is partially related to PR #45280 from @adamsitnik, as he specifically lists that AsyncLocal is expensive and he optimized the code paths related to some of it. I have not tested whether his PRs help with the problem above; he suggested on Twitter that I should create an issue with a repro.
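On the MemoryCacheEntryOptions point: besides setting values directly on the entry as the repro does, a single options instance can be shared across writes, since the Set extension method copies the option values onto the entry rather than keeping a reference to the options object. A minimal sketch; the class and field names here are illustrative, not from the app:

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

public static class OptionsReuseSketch
{
    // One shared, effectively read-only options instance per "profile" of
    // settings avoids allocating a new MemoryCacheEntryOptions per write.
    private static readonly MemoryCacheEntryOptions SlidingOneMinute =
        new MemoryCacheEntryOptions
        {
            Size = 1,
            SlidingExpiration = TimeSpan.FromMinutes(1)
        };

    public static void Add(IMemoryCache cache, string key, object value)
        => cache.Set(key, value, SlidingOneMinute);
}
```

This matches what the app was changed to do: one options object per settings profile instead of one per cache entry.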
It seems that the AsyncLocal exists to support linking nested cache entries, so that expiration settings (such as expiration tokens) added to an inner entry propagate to the outer one, e.g.:

using (var entry = cache.CreateEntry(key))
{
    entry.SetValue(obj);
    using (var entry1 = cache.CreateEntry(key1))
    {
        entry1.SetValue(obj);
        entry1.AddExpirationToken(expirationToken);
    }
}

Currently there is no way to opt out of this behavior and avoid these allocations. I was hoping that I could add a new internal API here:

runtime/src/libraries/Microsoft.Extensions.Caching.Abstractions/src/IMemoryCache.cs Line 26 in 69d6127
I can't find a way to achieve that without introducing a new public API. @eerhardt @maryamariyan Before I write an API proposal, do you believe that avoiding these allocations is worth introducing a new public API?
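For context on where the allocations come from: the linking is implemented by tracking the current CacheEntry scope in an AsyncLocal, and ExecutionContext is immutable, so every write to an AsyncLocal<T>.Value produces a new context with a fresh AsyncLocalValueMap (the One/Two/ThreeElementAsyncLocalValueMap types seen in the dump, depending on how many async locals are live). A minimal sketch of that mechanism, independent of MemoryCache:

```csharp
using System;
using System.Threading;

public static class AsyncLocalSketch
{
    private static readonly AsyncLocal<object> Scope = new AsyncLocal<object>();

    // Returns true when writing the AsyncLocal produced a new ExecutionContext.
    // Each Value write allocates a new context plus a new value map; a scope
    // set and a scope restore means two such writes per tracked operation.
    public static bool WriteAllocatesNewContext()
    {
        var before = ExecutionContext.Capture();
        Scope.Value = new object();   // allocates: new context + value map
        var after = ExecutionContext.Capture();
        Scope.Value = null;           // restoring the old value allocates again
        return !ReferenceEquals(before, after);
    }

    public static void Main() => Console.WriteLine(WriteAllocatesNewContext()); // prints True
}
```

Under the repro's load (thousands of CreateEntry calls per request), those per-write allocations add up to the numbers seen in the dump.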
@Tornhoof definitely! Is there any chance that you would like to contribute to the docs?
Yeah, I'll create a PR for the docs repo in the next few days.
@davidfowl @Tratcher - thoughts on making the "linked CacheEntry" feature optional? I think I remember discussing maybe removing it altogether. Given the above analysis, it appears to be a pretty expensive feature even in the case where a user doesn't use it.
The "magic" factor of this feature has always been a problem. Letting people opt out of it is a good idea. How you do that without breaking people that are using it is a good question. Trying to opt out on a per-entry basis seems like it would be a challenge; the scope is created too early. We don't want to duplicate a bunch of APIs unnecessarily. What about adding one option on MemoryCacheOptions to opt out for the whole cache? Once there's an API to control the feature, we could discuss changing the default value in a future version.
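To make that shape concrete, a per-cache opt-out could look roughly like the sketch below. TrackLinkedCacheEntries is a hypothetical property name used purely for illustration; it is not an existing API at the time of this discussion, which is why the line is commented out:

```csharp
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.DependencyInjection;

public static class OptOutSketch
{
    public static IMemoryCache Build()
    {
        var services = new ServiceCollection();
        services.AddMemoryCache(options =>
        {
            // Hypothetical opt-out: with linking disabled, CreateEntry would
            // skip the AsyncLocal scope bookkeeping entirely, and nested
            // entries would no longer inherit expiration from their parent.
            // options.TrackLinkedCacheEntries = false;
        });
        return services.BuildServiceProvider().GetRequiredService<IMemoryCache>();
    }
}
```

A single flag on MemoryCacheOptions keeps the existing CreateEntry API surface unchanged, and flipping its default in a later version would only need a breaking-change note for code that relies on linking.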
I would make a change to turn this feature off by default. It adds overhead for scenarios that don't need it.