{"id":2511,"date":"2014-04-24T20:52:21","date_gmt":"2014-04-25T04:52:21","guid":{"rendered":"http:\/\/www.atumvirt.com\/?p=2511"},"modified":"2014-04-24T20:52:21","modified_gmt":"2014-04-25T04:52:21","slug":"xendesktop-77-1-monitor-service-memory-leak","status":"publish","type":"post","link":"https:\/\/avtempwp.azurewebsites.net\/2014\/04\/xendesktop-77-1-monitor-service-memory-leak\/","title":{"rendered":"XenDesktop 7\/7.1 Monitor Service Memory Leak"},"content":{"rendered":"

In late February and early March we experienced “All the Citrix XML Services configured for farm failed to respond to this XML Service transaction.” As virtual desktops have grown increasingly important in our organization, this failure was critical to resolve quickly. Unfortunately, the first time it happened, reboots of the delivery controllers and StoreFront servers didn’t resolve the problem, and it eventually went away on its own. When it happened again the next week in early March, we decided to submit an “emergency” change request to upgrade from XenDesktop 7 to XenDesktop 7.1. About a week and a half later the problem occurred yet again, and we contacted our Citrix rep. When it occurred a third time, I noticed something strange: the Citrix Monitor Service on one of the delivery controllers had memory usage that stuck out like a sore thumb, a whopping 3.7 GB compared to a mere 477 MB on the other DC.

After seeing this behavior, I configured service monitoring in System Center Operations Manager with a memory threshold of 800 MB. If you have Operations Manager, you owe it to yourself to monitor critical XenDesktop services like this. Once I did, I quickly caught some dramatic behavior.
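If you don't have Operations Manager available, a scheduled script can approximate the same threshold check. Below is a minimal sketch, not the SCOM monitor itself; it assumes the Monitor Service runs as a process named Citrix.Monitor.exe (verify the actual executable name in Task Manager on your delivery controllers) and that the psutil package is installed.

```python
# Minimal sketch of an 800 MB memory check for the Citrix Monitor Service.
# Assumptions (verify in your environment): the service's process is named
# "Citrix.Monitor.exe" and psutil is installed (pip install psutil).
import psutil

THRESHOLD_BYTES = 800 * 1024 * 1024      # same 800 MB threshold used in SCOM
PROCESS_NAME = "Citrix.Monitor.exe"      # assumed executable name; confirm in Task Manager

def monitor_service_memory():
    """Return the working-set size (bytes) of the Monitor Service process, or None if not found."""
    for proc in psutil.process_iter(['name', 'memory_info']):
        if proc.info['name'] and proc.info['name'].lower() == PROCESS_NAME.lower():
            return proc.info['memory_info'].rss
    return None

if __name__ == "__main__":
    rss = monitor_service_memory()
    if rss is None:
        print("Citrix Monitor Service process not found")
    elif rss > THRESHOLD_BYTES:
        print(f"ALERT: Monitor Service is using {rss / (1024 ** 2):.0f} MB")
    else:
        print(f"Monitor Service memory OK: {rss / (1024 ** 2):.0f} MB")
```

Run it from Task Scheduler every few minutes and wire the alert branch to whatever notification you already use; it is a stand-in for proper monitoring, not a replacement for it.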

\"MonitorServiceMemory\"<\/a><\/p>\n


The memory utilization shot from about 700 MB to nearly 3 GB in just a few moments. I received an e-mail alert from Operations Manager and restarted the service, trying to stay ahead of the problem. However, as you can see on the chart, the problem recurred a short time later, and I restarted it again. This happened two more times that night before I threw in the towel and simply let the service use 3 GB of memory while I waited to apply the private fix I obtained from Citrix. If you are experiencing this problem, contact Citrix to obtain the private fix. I’ve been told the issue can occur with XenDesktop 7 as well (it may well have occurred for us before; I simply wasn’t monitoring for it until we had broker issues).
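For reference, the stopgap amounted to nothing more than bouncing the service whenever an alert fired. A hedged sketch of automating that is below; it assumes the service's display name is "Citrix Monitor Service" (net stop/start accept display names; confirm the exact name in services.msc) and that the script runs from an elevated prompt. It is only a band-aid; the real fix is the hotfix from Citrix.

```python
# Hedged sketch of the stopgap I was applying by hand: restart the Citrix
# Monitor Service when a memory alert fires. Assumptions: the service's
# display name is "Citrix Monitor Service" (confirm in services.msc) and
# this runs elevated on the delivery controller.
import subprocess

SERVICE_DISPLAY_NAME = "Citrix Monitor Service"

def restart_monitor_service():
    """Stop and restart the service using the built-in Windows `net` command."""
    subprocess.run(["net", "stop", SERVICE_DISPLAY_NAME], check=True)
    subprocess.run(["net", "start", SERVICE_DISPLAY_NAME], check=True)

if __name__ == "__main__":
    restart_monitor_service()
```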


I’m pleased to report that since installing the private hotfix the issue has not recurred. We were lucky: the memory utilization was “only” about 3-4 GB, and our DCs have very large RAM assignments because we previously saw a similar leak in the XenDesktop 7 Broker Service while using Hyper-V/SCVMM (a problem we haven’t been able to duplicate since switching to VMware, oddly enough). Our support engineer said other customers reported that the XML service failed to respond when this issue occurred, which may have been due to massive disk thrashing on servers without enough RAM to absorb such a drastic increase.


*Update*

After applying the patch last Tuesday, I’m not entirely convinced the leak is gone, but the behavior is certainly better in that the growth isn’t all at once. Notice the steady increase over the past few days since the patch was applied on 4/23:

\"Despite<\/a>

Despite having the private hotfix, the memory does seem to be growing steadily<\/p><\/div>\n","protected":false},"excerpt":{"rendered":"
