Processing SSAS in Tabular Mode not refreshing data - by Gilbert Q

Status: Not Reproducible

The product team could not reproduce this item with the description and steps provided. A more detailed explanation for the resolution of this particular item may have been provided in the comments section.

ID: 685325
Status: Closed
Type: Bug
Repros: 2
Opened: 8/24/2011 1:08:17 AM
Access Restriction: Public


When I right-click the database in SSAS Tabular mode, select Process, and choose Full Process, the data does not get updated or refreshed with the current data from the source queries. When I refresh the data in Visual Studio 2010, the data is correct in the project, but even after deploying the project to the server, the data is still not updated or refreshed.

The only way I have gotten this to work is to use SSMS 2008, and then only a Full Process, which brings up the old version of the SSAS processing dialog, updates or refreshes the data.

I have also tried this within Denali SSIS and it also does not refresh the data.
Posted by lverhelst on 3/10/2014 at 10:18 PM
I have experienced the same issue and want to know: when you change the property from Default to "Full", how does this affect partitions? E.g., if I am deploying a weekly update for my current-week partition, will I get a full deploy of the database (all partitions) or just an update to the data in the current-week partition?
Posted by TheGr8DB on 11/23/2012 at 5:03 PM
When I process the tabular database from SSMS with Process Full, the data does not get updated.
When I process the tabular model in BIDS and then deploy from there with the Process Default/Full option, the data is populated.

Posted by Gilbert Q on 8/25/2011 at 10:52 PM

Many thanks for the comments and explanation; it does make sense. And you are correct about changing the Processing Option to Full in BIDS.

I do think I was doing the SSMS and IS tasks incorrectly. Looking again at the IS tasks that I put onto the IS server and then ran via SQL Agent, that is where I got it wrong, so I can take another look at my issue. My job takes 7 seconds to complete when I know each task takes at least 20 seconds, so I will investigate this on my side.

I am glad that at least you guys did find one small bug.

Posted by Microsoft on 8/25/2011 at 2:02 PM

Thanks for reporting this issue. There are a couple of things in this report. One is a bug. One is the way the product works. Let me explain.

When you deploy from BIDS, we do two things. We deploy the metadata from the .bim file to the server. Optionally, we send a process command to the engine. We do not actually send the data from the designer to the engine. The data in the designer lives in a temporary workspace database, and does not get cloned or passed along. This is to ensure that the correct credentials are used for processing (if impersonating the service account) so there are no surprises on future processing operations.

To look at the processing command we send at deployment time, right click the .smproj and select Properties. There is a "Processing Option" property that determines what happens. The default value is "Default". This means we let the Analysis Services engine decide whether or not it is necessary to process the data. When you do only a minor metadata change (or just a data refresh), the engine will decide that the deployment database doesn't need processing, so it doesn't add the new data. This is why you don't see your data change at deployment time.

If you would like the engine to always process your data at deployment time, change the Processing Option from "Default" to "Full". This will make sure you have the latest data at deployment time.
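For reference, a deployment with Processing Option set to "Full" results in the engine receiving a full-process command for the whole database, along these lines (a sketch only; the database ID "MyTabularDb" is a placeholder for your deployed database):

```xml
<!-- Sketch of an XMLA Process Full command against the whole database.
     "MyTabularDb" is a placeholder; substitute your deployed database ID. -->
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Type>ProcessFull</Type>
  <Object>
    <DatabaseID>MyTabularDb</DatabaseID>
  </Object>
</Process>
```

With Processing Option left at "Default", the engine instead decides for itself whether any processing is needed, which is why a metadata-only deployment may not refresh the data.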

Now to the SSMS and IS tasks. We could not reproduce the issue where the data was not updated when issuing the Process Full command on the database, nor could we reproduce issues with the Integration Services processing task. Next time you run into trouble, can you please use Profiler to connect to the AS instance and see whether the engine is reporting failures during processing? It could be database specific.
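Alongside Profiler, one quick way to check whether processing actually refreshed the data is to query the schema rowset DMVs from an MDX query window in SSMS (a sketch; run it against the deployed tabular database):

```sql
-- Sketch: DISCOVER schema rowset showing when each model (cube) last
-- had its data updated. Run in an MDX query window in SSMS.
SELECT CUBE_NAME, LAST_DATA_UPDATE
FROM $SYSTEM.MDSCHEMA_CUBES
```

If LAST_DATA_UPDATE does not advance after your Process Full command, the engine never actually reprocessed the data, which points to the command (or the job invoking it) rather than the sources.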

We did, however, reproduce an issue where the UI says it did a Process Recalc on the data but in actual fact a Process Full happened on the back end. We will get a developer to look at that issue in the future.

Cathy Dumas
Program Manager, Analysis Services