I have decided to write about SSAS 2008 errors that I come across occasionally, either by my own hand or through things I see my customers doing from time to time. In particular, if I’ve had trouble finding the resolution anywhere else on the Internet, I think it’s important to share my fix and save other people some time!
Issue: One bugbear is the following error message:
Errors in the back-end database access module. The size specified for a binding was too small, resulting in one or more column values being truncated.
Resolution:
This is my most un-favourite error message of all time. If you get it, it can be difficult to identify where the issue is. The error seems to arise if you make any changes to a view or table underlying the DSV, which can result in a mismatch between the column definitions cached in the DSV and the actual source.
The error message usually gives you details of the offending column. The fix seems to be to re-open the DSV, close it again, then reprocess the dimension, then the cube. If that doesn’t work, you can try wading through the XML source of the DSV to see where the mismatch is.
It might be worth having a look at the health of the server. I have noticed that I don’t get this error message on some machines, yet I have had it repeatedly on one machine belonging to a customer – a server that was slowly grinding to a halt. Since it wasn’t my machine, I wasn’t in a position to do very much except try to get their IT support to look at it.
It seems that prevention is the best cure here. Having had this error appear at the worst possible times, I now try to prevent it by using views in the DSV, and for every column in the view I explicitly cast it to a particular type.
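To make that concrete, here’s a minimal sketch of the sort of view I mean – the table name dbo.Customer and the column names and sizes are just placeholders for your own schema:

```sql
-- Minimal sketch of a DSV-facing view (table and column names are placeholders).
-- Pinning every column to an explicit type means a change to the underlying table
-- can't silently alter the size that SSAS binds to.
CREATE VIEW dbo.vw_DimCustomer
AS
SELECT
    CAST(CustomerKey  AS int)            AS CustomerKey,
    CAST(CustomerName AS nvarchar(100))  AS CustomerName,
    CAST(City         AS nvarchar(50))   AS City,
    CAST(Country      AS nvarchar(50))   AS Country
FROM dbo.Customer;
```

The point is that if someone alters the underlying table, the view still presents the exact types the DSV was built against, so you pick up the change deliberately rather than via a truncation error at processing time.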
Sometimes it’s been near impossible to work out where the issue is, so I’ve simply had to start again: cast every source column to an explicit type, and then re-create a new cube using the dimensions etc. that I’ve already created. It’s not great, but it can sometimes be quicker just to do this.
I hope that this helps!
Great blog – just started reading through it. I ran into this issue a short time ago – it was expected, I knew that I'd changed some field widths.
I refreshed the DSV, but that didn't do it.
I went to the dimension, found the attribute, dug deep into the properties and found the column type definition and changed that, but that didn't fix it.
I found that you had to “unlink” the dimension from the cube by removing it from the “dimension usage” tab in the cube, THEN fix the data type in the dimension (which had reverted!), process the dimension, then link it up again to the cube.
I'll keep reading!
That's a great tip – thank you very much!
Hi Jen,
I've had this issue today after a table was rebuilt with changes in data types. Funny thing was the error was on the NameColumn attribute in the dimension structure, but this data type hadn't changed! I pointed it back at itself and it was fine. Weird.
I've run into this many times and yes, it can be hard to debug. The Data Type Discrepancy Check feature in the BIDSHelper add-in can really help here. However, this only helps fix sync problems within dimensions.
You can't work out design or ETL issues that way. So remember to check that the data types and lengths of the foreign key columns in the fact table match whichever columns they join to in the dimension tables.
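For example, something along these lines will flag name-matched columns whose type or length differs between a fact table and a dimension table – FactSales and DimCustomer are just placeholder names:

```sql
-- Rough sketch: list columns that share a name between the fact and dimension tables
-- but differ in type or length. Table names are placeholders for your own schema.
SELECT  f.COLUMN_NAME,
        f.DATA_TYPE AS FactType, f.CHARACTER_MAXIMUM_LENGTH AS FactLength,
        d.DATA_TYPE AS DimType,  d.CHARACTER_MAXIMUM_LENGTH AS DimLength
FROM    INFORMATION_SCHEMA.COLUMNS AS f
JOIN    INFORMATION_SCHEMA.COLUMNS AS d
        ON d.COLUMN_NAME = f.COLUMN_NAME
WHERE   f.TABLE_NAME = 'FactSales'
  AND   d.TABLE_NAME = 'DimCustomer'
  AND  (f.DATA_TYPE <> d.DATA_TYPE
        OR ISNULL(f.CHARACTER_MAXIMUM_LENGTH, -1) <> ISNULL(d.CHARACTER_MAXIMUM_LENGTH, -1));
```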
I had a join to a non-key attribute of type nvarchar(20) set up in the Dimension Usage area and kept getting a binding error when trying to process the cube – complaining about truncation. However, no offending column was specified in the error!
Eventually I worked out that this was a length mismatch between the fact and dimension columns specific to that dimension usage join: the fact column was nvarchar(30) and the dimension was nvarchar(20).
To further complicate it, I had this same join used for two facts to the same dimension. One partition would always process; the other just wouldn't. It turned out the fact that couldn't process was the only one actually containing values longer than 20 chars, causing the truncation and hence the error!
So overall this was an ETL issue – the wrong data was being stored in the fact foreign key. In the end I made them all nvarchar(20), which was the correct length as per the specs, and also fixed the ETL. *phew*
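A quick check like this (again with placeholder table and column names) would have shown the offending rows straight away:

```sql
-- Sketch: find fact rows whose join column is wider than the nvarchar(20)
-- dimension column it binds to. Table/column names are placeholders.
SELECT  CustomerAltKey, LEN(CustomerAltKey) AS ActualLength
FROM    dbo.FactSales
WHERE   LEN(CustomerAltKey) > 20;
```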
I ran into this issue last week but in my case it had nothing to do with types but rather with field widths as Todd pointed out. I blogged about it here, hope this will be helpful
http://shahfaisalmuhammed.blogspot.com/2012/03/size-specified-for-binding-too-small.html
Thanks, it helped me.
I got the same error in an existing cube, and I didn't have the option of recreating it as it was built by someone else. What I found was that the error section listed two different column data-size errors from the same dimension. I was focusing on only one of the columns with no luck, but when I checked the data size of the other errored column I found the clue: I changed the size in the dimension designer's column properties from 4000 to 5000, because new data in the database had come in a bit larger than I knew about, and then it processed fine.
Thanks Jen, you guide well.
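For anyone hitting the same thing, a quick look at the widest value actually sitting in the source column takes the guesswork out of picking the new size – dbo.DimProduct and LongDescription here are just placeholder names:

```sql
-- Sketch: check the widest value currently in the source column before
-- resizing the attribute's DataSize in the dimension designer. Names are placeholders.
SELECT  MAX(LEN(LongDescription)) AS MaxActualLength
FROM    dbo.DimProduct;
```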