Power BI and business application platform

Creating a label translation system with Power BI

In a Power BI Desktop data model, you can't define multiple translations of a caption to provide culture-specific labels. Today we will see how to perform translations in Power BI models anyway.

A translation consists of a translated caption for the name of each field in a table, bound to a column that provides the data values in the target language. You can have multiple translations; there is no theoretical limit on the number of translations you can embed in a model.

We will perform this using Power Query.

In this post we get data from three CSV files from AdventureWorks: product, sales territory, and sales. We will create the model in the file model.xlsx and the dictionary in Dictionary.xlsx. See the files in the folder below:

So we have a series of labels, and we want to look them up in a lookup table in Dictionary.xlsx with two columns, one for the it-IT labels and one for the en-US labels, shown below:
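To make the idea concrete, here is a minimal Python sketch of such a dictionary; the label pairs are illustrative examples based on the AdventureWorks column names used later in the post, not the actual contents of Dictionary.xlsx:

```python
# Illustrative en-US -> it-IT label dictionary (hypothetical entries).
label_dictionary = {
    "SalesTerritoryRegion": "Regione",
    "SalesTerritoryCountry": "Paese",
    "SalesTerritoryGroup": "Gruppo",
}

def translate_label(label: str) -> str:
    """Return the it-IT translation, or the original label when missing."""
    return label_dictionary.get(label, label)
```

A label not present in the dictionary simply passes through unchanged, which is the safe behavior for columns we have not translated yet.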

First we import the territory lookup table (from Dictionary.xlsx) into Power Query in the model.xlsx file.

Next we import the territory table from the SalesTerritory.csv file into Power Query in model.xlsx. The objective is to change each label to its correct translation: for example, SalesTerritoryRegion becomes Region, or "Regione" in Italian. After importing it, we can observe the following structure in Power Query:

Now we will create a query to perform the translation. The M code is:

    let
        Source = Csv.Document(File.Contents(PathFile & "\SalesTerritory.csv"),[Delimiter=",", Columns=6, Encoding=1252, QuoteStyle=QuoteStyle.None]),
        #"Removed Columns2" = Table.RemoveColumns(Source,{"Column2", "Column6"}),
        #"Renamed Columns" = Table.RenameColumns(#"Removed Columns2",{{"Column3", "Column2"}, {"Column4", "Column3"}, {"Column5", "Column4"}}),
        // Split the table: the first row holds the labels to translate,
        // the remaining rows hold the data
        #"Kept First Rows" = Table.FirstN(#"Renamed Columns",1),
        #"Removed Top Rows" = Table.Skip(#"Renamed Columns",1),
        #"Changed Type" = Table.TransformColumnTypes(#"Removed Top Rows",{{"Column1", type number}}),
        // Transpose the header row so each label becomes a row, then look it up
        #"Transposed Table" = Table.Transpose(#"Kept First Rows"),
        #"Invoked Custom Function" = Table.AddColumn(#"Transposed Table", "lkpLabel", each lkpLabel([Column1], lbTerritory, 2, false)),
        #"Removed Columns" = Table.RemoveColumns(#"Invoked Custom Function",{"Column1"}),
        #"Transposed Table1" = Table.Transpose(#"Removed Columns"),
        // Put the translated header row back on top of the data rows
        #"Table Combine" = Table.Combine({#"Transposed Table1",#"Removed Top Rows"}),
        #"Promoted Headers" = Table.PromoteHeaders( #"Table Combine", [PromoteAllScalars=true])
    in
        #"Promoted Headers"

Let's start analyzing the formula at the "Kept First Rows" and "Removed Top Rows" steps. These two steps set two variables: the first contains the first row, which holds the labels I have to replace; the second contains all the remaining rows, which we will need later.

The first result ("Kept First Rows") is now ready to be transposed using the Table.Transpose() function.
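The split-and-transpose steps can be sketched in Python on a plain list-of-rows table (the rows below are illustrative data, not the real CSV contents):

```python
# Illustrative table: first row is the header labels, the rest is data.
rows = [
    ["SalesTerritoryKey", "SalesTerritoryRegion", "SalesTerritoryCountry"],
    ["1", "Northwest", "United States"],
    ["2", "Northeast", "United States"],
]

kept_first_rows = rows[:1]   # like Table.FirstN(..., 1): the labels to replace
removed_top_rows = rows[1:]  # like Table.Skip(..., 1): the data rows kept for later

# Transposing the single header row turns each label into its own row,
# which is what lets us add a lookup column per label.
transposed = [[label] for label in kept_first_rows[0]]
```

After the transpose, each label sits in its own row of a one-column table, ready for the lookup step.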

Here is the key step, where we call the lkpLabel function to create a column with the new translated label. This function uses four parameters:

    #"Invoked Custom Function" = Table.AddColumn(#"Transposed Table", "lkpLabel", each lkpLabel([Column1], lbTerritory, 2, false))
  • Lookup_value => the value of the column to be converted ([Column1])
  • Table_array => the lookup table (lbTerritory)
  • Column => the number of the column in the lookup table to return the value from
  • Approximate_match => optional

Below you can see the new column (lkpLabel):
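The lkpLabel function itself is not shown in the post; a minimal Python sketch of a VLOOKUP-style lookup with the same four parameters could look like this (the names and the fallback behavior are assumptions; the real M implementation will differ):

```python
def lkp_label(lookup_value, table_array, column, approximate_match=False):
    """VLOOKUP-style lookup: find lookup_value in the first column of
    table_array and return the value from the given 1-based column.
    approximate_match is accepted for signature parity, but only exact
    matching is implemented here, since the query calls it with false."""
    for row in table_array:
        if row[0] == lookup_value:
            return row[column - 1]
    return lookup_value  # assumption: fall back to the original label

# lbTerritory stand-in: two-column dictionary table (en-US, it-IT),
# with illustrative rows.
lb_territory = [
    ["SalesTerritoryRegion", "Regione"],
    ["SalesTerritoryCountry", "Paese"],
]
```

For example, `lkp_label("SalesTerritoryRegion", lb_territory, 2, False)` returns "Regione", mirroring the call `lkpLabel([Column1], lbTerritory, 2, false)` in the query.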

Once we have the new values, we transpose the table again to restore the headers as a single row.

Now it remains to connect the new header with the body of the table, which contains all the rows we previously saved in the "Removed Top Rows" variable. The table with the single header row is combined with the original table using the Table.Combine() function:

    #"Table Combine" = Table.Combine({#"Transposed Table1",#"Removed Top Rows"})

Once the tables are combined, the first row contains the translated labels, and we promote it to headers using the Table.PromoteHeaders() function:

   #"Promoted Headers" = Table.PromoteHeaders( #"Table Combine", [PromoteAllScalars=true])
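Putting the whole transformation together, the technique can be sketched end to end in a few lines of Python (illustrative data and dictionary; the real query operates on the CSV table in Power Query):

```python
def translate_headers(rows, dictionary):
    """Replace the labels in the first row using the dictionary, then
    'promote' that row to headers: return (headers, data_rows)."""
    header, *data = rows
    translated = [dictionary.get(label, label) for label in header]
    return translated, data

# Illustrative inputs.
dictionary = {"SalesTerritoryRegion": "Regione"}
rows = [
    ["SalesTerritoryKey", "SalesTerritoryRegion"],
    ["1", "Northwest"],
]

headers, data = translate_headers(rows, dictionary)
# headers -> ["SalesTerritoryKey", "Regione"]
```

This is the same split / translate / recombine / promote sequence that the M query performs with Table.FirstN, Table.Skip, Table.Combine, and Table.PromoteHeaders.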

You can download the folder with all of the demo files from this link.

Optimizing DAX

SQL Saturday 589 Interview

Tweet SQL Saturday 589 with Power BI

Basic Data Modelling with DAX by Marco Pozzan at the University of Parma

Power BI: basic data modelling with DAX by Marco Pozzan from UGISS on Vimeo.

Cloud Conference 2016 My video on Power BI + Flow

My new video on Dax Fundamentals

SSRS Branding File Editor

A new branding feature has been introduced for SSRS in SQL Server 2016. This feature is useful for customizing the SSRS Report Manager portal with display colors, and a logo can also be added to the portal.

The tool is available at http://ssrsbrandingfileeditor.codeplex.com/

Performance Issues with Number of Virtual Log Files In SQL Server


Each transaction log file is made up of smaller parts known as virtual log files (VLFs). The number of virtual log files per transaction log file is not fixed: it is determined dynamically when the transaction log file is created or grows. The number of virtual log files cannot be set or configured by the database administrator. If the auto-growth settings are not properly managed, the database can be forced to auto-grow, which can cause serious performance issues. In the following sections, we will discuss the causes of this and how many virtual log files can exist.

Impact of VLF on SQL Server Performance

In SQL Server, transaction log files are created with an initial size of 2 MB, and the default growth value is 10% of the current size. These options can be modified when the database is created to accommodate its needs. The auto-growth option is turned on by default, and SQL Server creates the database with unrestricted file growth. If these settings are not properly managed, problems follow: until an auto-growth operation has finished, the server stops all processing. Moreover, due to the physical organization of the hard drive, the space taken by each auto-growth is often not physically close to the previous allocation. This leads to physical fragmentation, which causes slower performance.

The server should not have an excessive number of virtual log files inside the transaction log. A large number of small virtual log files slows down the recovery process that a database goes through on startup or after restoring a backup. The threshold for significantly affecting recovery performance appears to be around ten thousand virtual log files, and the symptoms become clearly noticeable at around one hundred thousand.

Tips to Fix this Performance Issue:

  • You can determine the number of VLFs in a specific database by executing the DBCC LOGINFO command and checking the number of records returned: each row in the result represents one virtual log file.
  • The number of virtual log files can be decreased by running the DBCC SHRINKFILE command.


This post has discussed the performance issues caused by a large number of virtual log files in the SQL Server transaction log. It should give users a proper understanding of how virtual log files can slow down the performance of the server.
