Channel: daxstudio Discussions Rss Feed

New Post: not a valid table expression

Hi, thanks for the tool. I'm having trouble getting started, though.

I'm using DAX Studio 2.3.6 and connecting to an Excel Power Pivot model. It's Excel 2016, and the file is an .xlsb, not .xlsx (I don't know if that makes any difference).

I'm trying a very simple measure, to get started:

EVALUATE
(
SUM ( 
    MasterDetail[Total_Quota] 
    ) 
)

and instead of results, it gives me this message:

Query (1, ) The expression specified in the query is not a valid table expression

Can you please give me a clue?

New Post: not a valid table expression

Never mind, I think I have it. This tool is useful for returning tables, or DAX statements that evaluate to tables. Is that right?

New Post: not a valid table expression

Yes, that is correct. DAX Studio sends queries to a tabular model and displays the results. The EVALUATE statement expects a table expression, and SUM returns a scalar value, so the error coming back from the model is correct.

To return a scalar value you can use the ROW function.

eg.
EVALUATE
  ROW( "My Sum"
  , SUM( MasterDetail[Total_Quota] )
)

New Post: portable version?

Hello there, is it possible to have DAX Studio run as a standalone version without the (exe) installer?
Cheers, chefe

New Post: portable version?

Yes, it would be. The installer is mainly there to:
  • check that the dependencies are installed
  • hook up the Excel add-in
  • set DAX Studio as the default handler for .dax files
If you just want to work with SSAS servers or Power BI Designer, then after installing you could just copy the DAX Studio folder from the install location onto a USB key and it would work. But that's only if the required dependencies (.NET 4.5 and the SQL Server AMO and ADOMD libraries) are installed; if those dependencies are not there, the program cannot start. We'd have to build some sort of bootstrapper to make it a more robust portable experience.

There are also a few settings in the Options window which we write to the current user's registry if they are changed from the defaults, and the query history gets stored in the current user's %appdata% folder. So there would be some minor changes needed to create a config option to write these to the local folder to make it truly portable, but it would be possible.

New Post: portable version?

This discussion has been copied to a work item, where the discussion continues.

New Post: Newbie trying to get started

So there are 2 issues here.

First is that you can't execute a DEFINE statement on its own; it does not return anything, which is why you get the error about the result not being a rowset. You could fix this by adding an EVALUATE statement. To return a single value that calculates the grand total for profit you could use the ROW function.

eg.
DEFINE 
MEASURE PPC[profit] = CALCULATE(
                                    VALUES(PPC[GPTV]),
                                    FILTER(
                                               ALL(PPC),
                                               PPC[Month]=VALUES(DateKey[Month])  &&
                                               PPC[Depot]=VALUES(Depot[Depot])
                                               )
                                    )
EVALUATE 
ROW("Profit", PPC[profit] )
If you wanted to see the profit split out by another column, like the month for instance, you could use a pattern like the following with the ADDCOLUMNS function:
DEFINE 
MEASURE PPC[profit] = CALCULATE(
                                    VALUES(PPC[GPTV]),
                                    FILTER(
                                               ALL(PPC),
                                               PPC[Month]=VALUES(DateKey[Month])  &&
                                               PPC[Depot]=VALUES(Depot[Depot])
                                               )
                                    )
EVALUATE 
    ADDCOLUMNS( 
        VALUES('Date'[Month])
        , "Profit"
        , PPC[profit] 
    )
The second problem is the underlying issue with your calculation. With the above queries DAX Studio is going to return the same error about a table with multiple values being supplied. The problem here is that CALCULATE expects an expression that returns a scalar value (ie. a single value) while the VALUES function returns a single-column table. It gets a little bit grey here, as in some circumstances VALUES() can behave like a scalar value: if a table function returns a table with one column and one row, the tabular engine will do an implied cast of that table to a scalar value. But usually if you are using that pattern you will "protect" the VALUES call with an IF expression, eg. IF( HASONEVALUE( PPC[GPTV] ), VALUES( PPC[GPTV] ), ... )
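As a fuller sketch of that protected pattern, using the table and column names from this thread (so treat the names as illustrative only):

eg.
DEFINE 
MEASURE PPC[Single GPTV] =
    // Only rely on the implied table-to-scalar cast when it is safe,
    // ie. when the current filter context leaves exactly one value
    IF (
        HASONEVALUE ( PPC[GPTV] ),
        VALUES ( PPC[GPTV] ),
        BLANK ()  // fall back to blank when there are multiple values
    )
EVALUATE 
ROW ( "Single GPTV", PPC[Single GPTV] )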

But most of the time if you are using CALCULATE the first expression should either refer to a measure or to an aggregation function like SUM, MAX, MIN, AVERAGE, etc.

So I'm guessing that for a profit measure you'd want to see the SUM, so you should use an expression like the following:
DEFINE 
MEASURE PPC[profit] = CALCULATE(
                                    SUM(PPC[GPTV]),
                                    FILTER(
                                               ALL(PPC),
                                               PPC[Month]=VALUES(DateKey[Month])  &&
                                               PPC[Depot]=VALUES(Depot[Depot])
                                               )
                                    )
EVALUATE 
ROW("Profit", PPC[profit] )

New Post: Installation with ConfigMgr

Hi
I use ConfigMgr to install all applications. It uses the system account for all installations, but for some reason the setup does not recognize the prerequisites and wants to download them. The computers do not have direct internet access, so it fails. When I try the "Dependency Check" script it gives me the correct output. Is there any switch to disable the prerequisites check?
Also we use AppLocker, any signing of the executable in a near future?

New Post: Installation with ConfigMgr

That sounds a bit strange, but maybe one of the checks is returning a false negative for the system account. Can you add a /LOG=<filename> parameter to the install command and attach a copy of the log file (you may need to create an issue to attach files)? It shouldn't be too hard to add a command line parameter to skip the dependency checks and installs.

I have not looked into signing the app, but I don't think that would be too hard.

New Post: Newbie trying to get started

Hi, sorry for the delay replying after your extensive answer. That's really helpful. I was trying to create a table with my filter that only had one value, as the combination of month and depot is unique. So actually what I was trying to understand is what table I have created in my filter, as clearly it is not a single value. I don't understand why it isn't a single value, so I thought I could fiddle around with the statement until it was. So I guess what I really want to do is some kind of EVALUATE SUMMARIZE or CALCULATETABLE expression to see how many rows and columns my FILTER is generating, rather than the actual answer?

In this case my PPC table looks like this

Month   Depot   GPTV
Apr16   BAS     50
Apr16   LIV     75
May16   BAS     60
May16   LIV     90

So for the pivot with May16 and BAS on row and column there is only the unique value 60. So if my filter was correct I would have a single cell, but clearly I have more, and I'm not sure if I have one month and lots of depots or the other way round. Or even all of both!

Thanks
Mike
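
For reference, this is roughly the kind of inspection query I had in mind, just to see which rows the filter keeps (a sketch with my table names and hard-coded slicer values):

eg.
EVALUATE
CALCULATETABLE (
    FILTER (
        ALL ( PPC ),
        PPC[Month] = "May16"
            && PPC[Depot] = "BAS"
    )
)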

New Post: Installation with ConfigMgr

Hi Radeck,

Thanks for those. There must be some issue with the system account accessing the registry key at HKCR\Installer\Assemblies\Global as that is what we check to find the current version of the installed dependencies. As a result it's assuming that you don't have these installed at all and is trying to download them. I'm testing a command line parameter now that should skip all the dependency checks entirely.

Regards
Darren

New Post: Newbie trying to get started

Actually, after spotting the obvious error with trying to use VALUES() in the first argument to CALCULATE, I missed the fact that you have other VALUES calls inside the FILTER function. I'm not really sure what you are trying to do here. If you are trying to simulate a relationship in code then there is an example of that in this article, but I'm not entirely sure if that is what you are trying to do.

The problem here is not the detail level, it's the calculation of the grand totals. At the grand total level you will always have multiple values coming back from the two calls to VALUES(), so doing an EVALUATE SUMMARIZE won't help here. This is probably more a case of the approach you've used not catering for totals than of issues in the data (although those could be there too).

Can you explain a bit more about what problem it is that you are trying to solve with this formula? And why you can't just do a straight sum of the [GPTV] column?
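
One quick way to see this for yourself is to count the values in context at the grand total level (a sketch using the column names from this thread):

eg.
EVALUATE
ROW (
    // with no filter on these columns, VALUES returns every distinct value,
    // so these counts will be greater than 1 at the grand total level
    "Months in context", COUNTROWS ( VALUES ( PPC[Month] ) ),
    "Depots in context", COUNTROWS ( VALUES ( PPC[Depot] ) )
)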

New Post: A surprise paste and a good one too! (Old news perhaps)

On a hunch I acted accordingly and pasted a DAX measure from DAX studio into the latest version of Power BI Desktop.

If you're in the non-comma-list-separator part of the world, it seems that Power BI Desktop has caught up and now converts the list separator of ',' into ';' without any further manual edits on paste.

i.e. you can write and run any comma-separated measure in DAX Studio and then copy and paste that measure directly into Power BI Desktop without any syntax shenanigans.
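
As an illustration (the table and column names here are hypothetical), a measure written with comma separators in DAX Studio, like:

DIVIDE ( SUM ( Sales[Profit] ), SUM ( Sales[Revenue] ) )

shows up in a ';'-locale Power BI Desktop after pasting as:

DIVIDE ( SUM ( Sales[Profit] ); SUM ( Sales[Revenue] ) )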

New Post: Nulls in a results column when I wouldn't expect there would be any

I have a date table ( 'calendar') joined on each of the tables in the statement below.
My question is, in the results, why would I see nulls in reportDate if the modelPositions table has a reportDate foreign key to 'calendar'[reportDate]?

Thanks,
Brad
EVALUATE
ADDCOLUMNS (
    ADDCOLUMNS (
        SUMMARIZE (   -- table
            'modelPositions',
            'calendar'[reportDate],
            'Models'[modID]
        ),
        "in",  CALCULATE ( SUM ( modelFlows[inflows] ) ),
        "out", CALCULATE ( SUM ( modelFlows[outflows] ) ),
        "beg", CALCULATE ( SUM ( modelPositions[beginningBaseValue] ) ),
        "end", CALCULATE ( SUM ( modelPositions[BaseValue] ) )
    ),
    "modTRdaily1", ( ( [end] - [beg] - ( [in] + [out] ) ) / ( [beg] + [in] ) )
)

New Post: Nulls in a results column when I wouldn't expect there would be any

Umm... because I didn't refresh the entire model after I added the modelPositions table. Please ignore.

New Post: Newbie trying to get started

Hi

Let's start at the beginning, as clearly I've gone off the cliff somewhere!

I am getting weekly data by depot of clicks on our website. I've created a measure, [Bookings Estimate], that converts these clicks to bookings based on an average of conversion from the preceding month. The next step is to turn this into money. However, I don't get a report of the value of transactions until the end of the month, so during the month I was just going to use an average value (GPTV = gross profit ticket value) for each depot. This is the table above, where Apr16 are real numbers and May some made-up numbers.

My pivot table has Depot as rows and a slicer for the month of May16. So what I am trying to create is a measure that does the following:

[Bookings Estimate]*GPTV value from the table where the Depot = Depot on the row and Month = month in the slicer

So essentially I was trying to filter my table by the row and the month slicer to give a single unique value for each combination of depot/month in the pivot. I tried using various versions of CONTAINS without any more success.

Does that make more sense? I was hoping to use DAX Studio to see the output of the filter bit, to see what table I was generating.
Thanks
Mike

New Post: Writing Large CSV Files

Hi guys

I quite regularly run queries against large PowerBI datasets (300+ million rows) and I notice that DAX Studio doesn't handle extracting data to CSV very well.

It appears to cache the entire data set in memory (ballooning out the memory on my 16 GB laptop) prior to writing to disk. Is it possible to fix this?

Further, large queries sometimes receive the following error: "Specified cast is not valid". The way around this appears to be setting the table's columns in PBI (during ETL in PQ) to strings.


Thanks,
Simon

New Post: Writing Large CSV Files

I would have thought it would be more efficient to extract data from the original source system. Is there something blocking you from running your extracts from the original source system?

You are right that it currently caches the entire data set in memory. This is what we have to do in order to bind the data to the grid output. The CSV extract is currently just built off the back of the same query engine: it loops through the data set and writes it out to disk. It might be possible for us to stream the data through a DataReader rather than caching it in a DataSet, but the query engine runs asynchronously (so that you can have multiple tabs each running different queries), so it would be a non-trivial change. We have been thinking about this, though, as it may also allow us to cancel queries part way through but still show a partial result set.
Further, large queries sometimes receive the following error, "Specified cast is not valid".
This is not something we've seen before. It may be something specific to a given data type, but my guess would be that it's probably related to some specific value in your data. I'll see if it's possible to get any extra information into the error message that will help us diagnose this.

New Post: Writing Large CSV Files

Thanks for the quick reply dgosbell!

I'd also like to thank you for designing such an amazing product btw. I use it very regularly.

I think there is value in allowing users to extract large data sets from PBI easily. This allows your average joe to perform one-off ETLs leveraging Power Query instead of other ETL software like SSIS. In fact I believe CSV extraction is a fairly highly voted request on the PBI feedback website.

Just a heads up: my current approach is to set up a linked SQL Server to the PBI Tabular instance and then run a T-SQL command to extract data to CSV. It's very reliable, but my goodness is it slow!