Prevent Cache Limit from Causing Application Deployment Failure
Once the cache has been filled, all subsequent application deployments fail until the cache self-cleans. In CAS.log you can see that the client refuses to download the content because the cache, not the actual disk, is full.
This strikes me as a non-optimal design choice if the goal is to successfully install applications. If I want to install an application, I do not want it to fail because of an artificial limit that, until very recently, was set at the time of client install. Most of the time the cache clears within 24 hours and there's no problem, but there are two specific and prevalent cases where there is.
Task Sequences: During OSD we want every app and every update applied before the user gets the machine, and we want it done quickly. It's very easy to exceed a limit that is ideal for day-to-day use. This has forced SCCM admins either to abandon the application model during the TS (a common MVP suggestion) or to write custom scripts that temporarily increase the cache size. Neither of these is ideal.
Large Applications: Some applications (e.g. CAD suites) are just incredibly large (20+ GB) and will, all on their own, exceed what an organization wants to reserve for cache space. Again, admins are either abandoning the app model entirely for these or running temporary cache-modification scripts. This simply shouldn't be necessary in a system designed to deploy applications.
So, what to do? One option would be to implement a rolling (first-in, first-out) cache: if cache space is needed, the oldest non-persistent content for a compliant application is removed to make room. If the new app's content is larger than the cache itself, then simply don't cache it. The only hard limit should be the amount of free disk space to reserve before the SCCM client refuses to download content of any type.
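To make the proposal concrete, here is a minimal sketch of that eviction policy. Everything here (the `CacheItem` and `RollingCache` names, the flags) is hypothetical illustration, not a ConfigMgr API:

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class CacheItem:
    content_id: str
    size: int          # bytes
    persistent: bool   # the 'Persist content in the client cache' flag
    compliant: bool    # owning application is installed/compliant

class RollingCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = deque()   # oldest content first
        self.used = 0

    def add(self, item):
        # Content larger than the whole cache: simply don't cache it.
        if item.size > self.capacity:
            return False
        # Evict the oldest non-persistent content belonging to a
        # compliant application until the new item fits.
        while self.used + item.size > self.capacity:
            victim = next((i for i in self.items
                           if not i.persistent and i.compliant), None)
            if victim is None:
                return False   # nothing evictable; caller must handle it
            self.items.remove(victim)
            self.used -= victim.size
        self.items.append(item)
        self.used += item.size
        return True
```

The point of the sketch: a full cache becomes a reason to evict old, already-installed content, not a reason to fail the new deployment.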
TL;DR: Stop breaking application deployment just because the cache is full. No one wants to troubleshoot app failures.
This is shipped in #MEMCM/#ConfigMgr 1910
It's so disappointing that this is only fixed for Task Sequences, not for large applications through Software Center. I'd still consider Bryan's other linked UserVoice suggestion (to execute directly from a share) a workaround for this issue. We really need a first-in, first-out rolling cache so you don't have to make your cache size equal to your entire catalogue's total size to guarantee deployments won't fail (which is of course ridiculous). 20 GB+ apps are quite common these days.
What about outside Task Sequences? Regular app deployments and update deployments all stop when the cache is full.
Or is this all fixed in 1910?
Bryan Dam commented
The first use case (first-day/Task Sequences) is resolved, great job team.
The second use case (single large app) is not so let's dog-pile on this one: https://configurationmanager.uservoice.com/forums/300492-ideas/suggestions/8875516-allow-deployments-using-application-model-to-be-in
Eric van Voorthuizen commented
This will be solved in 1906 with a new TS step option for app deployment that gives you more control over the cache.
But what about applications? Since Application Groups are introduced as a pre-release feature in 1906, we need an option to manage the ccmcache during application (group) deployment.
Taylor Harris commented
I agree with adding more control of this inside of the task sequence and application model itself, but for anyone else looking for a workaround without changing default client settings in the meantime, I've had success setting the 'SMSCACHESIZE' installation property in the "Setup Windows and ConfigMgr" task sequence step.
Doing this lets you specify a cache size during OSD, when the client is initially installed/initialized in the sysprepped OS; it will eventually be set back to your default client settings after the task sequence completes and a machine policy evaluation runs.
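For illustration, the installation properties field of that "Setup Windows and ConfigMgr" step might contain something like the following (the 25600 MB value is an arbitrary example, not a recommendation):

```
SMSCACHESIZE=25600
```

SMSCACHESIZE takes a value in megabytes, so this would set the cache to roughly 25 GB for the duration of the build.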
IT Helpdesk commented
Is there a bug report for the cache not being deleted as intended? The suggested solution is a workaround, not a resolution of the problem. I have multiple clients whose CAS.log shows failures to delete. Is Microsoft debugging these failures?
Error: DeleteDirectory:- Failed to delete Directory with Error 0x00000003. ContentAccess 5/31/2019 1:00:20 AM 432 (0x01B0)
Error: DeleteDirectory:- Failed to delete Directory C:\WINDOWS\ccmcache\c9.BCWork with Error 0x00000002. ContentAccess 5/31/2019 1:00:20 AM 432 (0x01B0)
I agree with the idea, it makes total sense, but how about we just add a tick box on the Application/Package that deletes the locally cached content after a successful install, like the existing 'Persist content in the client cache' tick box? That way, once the application is compliant, it just deletes its downloaded source files and the cache stays pretty much clean. I'm frustrated with downloads that get stuck halfway because the cache is full, and when I empty the cache, the application refuses to re-download. For some reason auto-clean cache does not work at my place :(
This is most frustrating when you have to run a script to increase the cache size to accommodate a larger package. Could there be an option to specify the cache size change/increase in the application settings?
How about cleaning the cache at the moment an install needs more space? When a new machine is set up, many software titles are installed and the cache runs out of space even though 99% of the cache is marked as deletable. (Adobe Creative Suite products are a problem here.)
Blake Erwin commented
+1 for the idea of a deployment setting that lets us ignore the cache limit for a specific application.
Additionally, higher limits should be configurable for the client cache size. If I set client settings to a cache limit of 100 GB, I notice that it caps the limit at "99,999" MB. The ability to increase the cache size further would be useful.
L U commented
I like Zeb's ideas. Also, have the persistence time be configurable in hours or even minutes, not just days, from when the install completes. Allow 0 minutes for instant removal after the final detection test passes.
Zeb Smith commented
I'd love to have a setting that allows the cache to use disk space until x% free space remains, and another to tune how long content persists in the cache.
The first could be system-wide and the second per Application.
That way, if I have a gigantic application to roll out, we can let it use up a large portion of disk for a short amount of time, and then shrink itself back down after a completed install.
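As a sketch of how Zeb's system-wide rule could be computed (the 10% default floor and the function itself are illustrative assumptions, not ConfigMgr behavior):

```python
def cache_budget(disk_total, disk_free, cache_used, min_free_fraction=0.10):
    """Bytes the cache may occupy while keeping min_free_fraction of the
    disk free for everything else. The cache's current contents count as
    reclaimable, so they are added back to the free pool."""
    reserved = int(disk_total * min_free_fraction)
    return max(0, disk_free + cache_used - reserved)
```

With this rule a gigantic application can temporarily take most of the disk (the budget grows when the disk is otherwise empty) and the cache shrinks back as other data fills the drive.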
Eric Van Boven commented
We have seen this so many times at our school. I cannot believe SCCM currently operates like this.
Kevin Street commented
I would also like to suggest an option where, during deployment of an application, you can tick a box along the lines of "Temporarily increase client cache size to accommodate this application," so that applications bigger than the usual cache size will install without error.
L U commented
This is a great case for allowing Applications to deploy from distribution points ... See https://configurationmanager.uservoice.com/forums/300492-ideas/suggestions/8875516-allow-deployments-using-application-model-to-be-in
... and please would you add your votes if you haven't already? Thanks!