Tag - Business Intelligence Development

Power BI: Implement Row-Level Security
Dec 15, 2022

While working on an embedded Power BI report, I got a requirement to implement row-level security: for example, region heads can see data for their region only. The report is accessible from a .NET MVC application, locations are assigned to users from that application, and the report is hosted on the Power BI server and authenticated by the .NET application with predefined credentials.

Sample Data

There are two tables, finance and Users:

Figure 1: finance table

Figure 2: Users table

Multiple options to achieve this with Power BI

We found the following options to implement it:

1. RLS (Row Level Security)
2. Through the query string
3. Control report filters used in the embed code

There are certain limitations with options #1 and #2 (I will describe them later), so we are moving on with option #3, "Control report filters".

Implementation with "Control report filters used in the embed code"

When you embed a Power BI report, you can apply filters automatically during the loading phase, or you can change filters dynamically after the report is loaded. For example, you can create your own custom filter pane and automatically apply those filters to reports to show user-specific insights. You can also create a button that allows users to apply filters to the embedded report.

Step 1: Identify the filter based on your requirement

There are five types of filters available:

- Basic - IBasicFilter
- Advanced - IAdvancedFilter
- Top N - ITopNFilter
- Relative date - IRelativeDateFilter
- Relative time - IRelativeTimeFilter

Based on my requirement I used a basic filter; you can use the others based on your needs (see the Power BI client filters documentation for details). In this blog I am continuing with the basic filter.

Step 2: Determine the table and column that you want to filter

In my case I wanted to filter the table 'finance' on the column 'country', because when a particular user logs in, the country column must be filtered. For example, when user G1 logs in, country = 'Germany'.

Table: 'finance'
Column: 'country'

Step 3: Put these two things into the filter code

The original screenshot of this code is not reproduced here; a sketch is included after Step 5 below.

Step 4: Make it dynamic

In the filter code there is a 'values' property where I passed "USA" and "Canada". Those values are hardcoded, but we want dynamic values that change based on which user is logged in. That is why I made a variable that contains the country names assigned to the logged-in user. For example, if user G1 logs in, this variable returns ["Germany"], and we assign the variable to 'values' (again, see the sketches below).

Note: If the logged-in user has multiple locations, the variable should return values like ["Germany", "USA", "India"].

Note: If the logged-in user is a manager or the CEO, they can see the data for all countries; for that, the variable has to return null instead of an empty array [].

Step 5: Put this code into the embed code

Step 5.1: Identify your method

1. User-owns-data (embed for your organization)
2. App-owns-data (embed for your customer)

Step 5.2: Put the code in the right place according to your method

For method #1, add the code to the [EmbedReport.aspx] file.

Figure 3: User-owns-data

For method #2, add the code to the [EmbedReport.cshtml] file.

Figure 4: App-owns-data

Figure 5: Sample
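Since the original code screenshots (the snippet from Step 3 and Figures 3-5) are not reproduced here, below is a minimal sketch of the Step 3 basic filter, assuming the powerbi-client JavaScript/TypeScript library (which re-exports the models types from powerbi-models). The table and column are the ones identified in Step 2; "USA" and "Canada" are the hardcoded sample values mentioned in Step 4.

```typescript
import { models } from "powerbi-client";

// Basic filter on finance[country] (Step 3).
// "USA" and "Canada" are hardcoded here; Step 4 makes them dynamic.
const basicFilter: models.IBasicFilter = {
  $schema: "http://powerbi.com/product/schema#basic",
  target: {
    table: "finance",  // the table identified in Step 2
    column: "country"  // the column identified in Step 2
  },
  operator: "In",      // keep only rows whose country is in `values`
  values: ["USA", "Canada"],
  filterType: models.FilterType.Basic
};
```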
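Continuing the sketch, here is one way to make the values dynamic as described in Step 4. The helper getCountriesForCurrentUser is hypothetical; it stands in for however your application resolves the locations assigned to the logged-in user.

```typescript
// Hypothetical helper: returns the logged-in user's assigned countries,
// e.g. ["Germany"] for user G1, ["Germany", "USA", "India"] for a user
// with multiple locations, or null for a manager/CEO who may see all data.
declare function getCountriesForCurrentUser(): string[] | null;

const userCountries = getCountriesForCurrentUser();

// null means "do not filter at all": pass an empty filter list.
// Otherwise, plug the user's countries into the basic filter's `values`.
const reportFilters: models.IBasicFilter[] =
  userCountries === null
    ? []
    : [{ ...basicFilter, values: userCountries }];
```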
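Finally, a sketch of Step 5: passing the filters in the embed configuration so they are applied while the report loads, again assuming powerbi-client. reportId, embedUrl, and accessToken are placeholders that your EmbedReport page must supply; reportContainer is the div hosting the report.

```typescript
import * as pbi from "powerbi-client";

// Placeholders supplied by your page / token service (assumptions,
// not values from the original post).
declare const reportId: string;
declare const embedUrl: string;
declare const accessToken: string;

// Create the powerbi service object. In the .aspx/.cshtml page the
// global `powerbi` object from powerbi.js can be used instead.
const powerbi = new pbi.service.Service(
  pbi.factories.hpmFactory,
  pbi.factories.wpmpFactory,
  pbi.factories.routerFactory
);

const reportContainer = document.getElementById("reportContainer")!;

const embedConfig: pbi.IEmbedConfiguration = {
  type: "report",
  id: reportId,
  embedUrl: embedUrl,
  accessToken: accessToken,
  tokenType: pbi.models.TokenType.Embed, // app-owns-data; use TokenType.Aad for user-owns-data
  filters: reportFilters                 // the per-user filters from Step 4, applied at load
};

const report = powerbi.embed(reportContainer, embedConfig);
```

If you need to change the filter after the report has loaded, recent versions of powerbi-client also expose report.updateFilters for that.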
And that's it. Now run the report from the application and it should work as expected. Feel free to reach out to me if you face any issues.

Conclusion

I hope I have shown you how easy it is to implement row-level security this way. We went through the process step by step: we started by identifying the filter type, then found the actual table and column we want to filter on and generated the filter code, and last but not least we put that code into our embed code.

Create SSIS Data Flow Task Package Programmatically
Jul 27, 2020

In this article, we will review how to create an SSIS data flow task package from a console application, with an example.

Requirements

- Microsoft Visual Studio 2017
- SQL Server 2014 SSDT

Done with the above requirements? Let's start by launching Microsoft Visual Studio 2017 and creating a new console project (note: the SSIS managed assemblies target the .NET Framework, so choose a .NET Framework console project rather than .NET Core). After creating the project, give it a proper name. In Solution Explorer, add the relevant references and make sure you have declared the namespaces below:

```csharp
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime;
using RuntimeWrapper = Microsoft.SqlServer.Dts.Runtime.Wrapper;
```

To use these namespaces we need to reference the SSIS assemblies that ship with SSDT (Microsoft.SQLServer.ManagedDTS, Microsoft.SqlServer.DTSPipelineWrap, and Microsoft.SqlServer.DTSRuntimeWrap). Keep in mind that all of these references should have the same version.

After importing the namespaces, ask the user for the source connection string, the destination connection string, and the table that will be copied to the destination.

```csharp
string sourceConnectionString, destinationConnectionString, tableName;

Console.Write("Enter Source Database Connection String: ");
sourceConnectionString = Console.ReadLine();

Console.Write("Enter Destination Database Connection String: ");
destinationConnectionString = Console.ReadLine();

Console.Write("Enter Table Name: ");
tableName = Console.ReadLine();
```

After the declarations, create instances of Application and Package.

```csharp
Application app = new Application();
Package Mipk = new Package();
Mipk.Name = "DatabaseToDatabase";
```

Add an ADO.NET source connection manager to the package.

```csharp
ConnectionManager connSource;
connSource = Mipk.Connections.Add("ADO.NET:SQL");
connSource.ConnectionString = sourceConnectionString;
connSource.Name = "ADO NET DB Source Connection";
```

Add an ADO.NET destination connection manager to the package.

```csharp
ConnectionManager connDestination;
connDestination = Mipk.Connections.Add("ADO.NET:SQL");
connDestination.ConnectionString = destinationConnectionString;
connDestination.Name = "ADO NET DB Destination Connection";
```

Insert a data flow task into the package.

```csharp
Executable e = Mipk.Executables.Add("STOCK:PipelineTask");
TaskHost thMainPipe = (TaskHost)e;
thMainPipe.Name = "DFT Database To Database";
MainPipe df = thMainPipe.InnerObject as MainPipe;
```

Add an ADO.NET source component to the data flow task.

```csharp
IDTSComponentMetaData100 conexionAOrigen = df.ComponentMetaDataCollection.New();
conexionAOrigen.ComponentClassID = "Microsoft.SqlServer.Dts.Pipeline.DataReaderSourceAdapter, Microsoft.SqlServer.ADONETSrc, Version=14.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91";
conexionAOrigen.Name = "ADO NET Source";
```

Get a design-time instance of the component and initialize it.

```csharp
CManagedComponentWrapper instance = conexionAOrigen.Instantiate();
instance.ProvideComponentProperties();
```

Specify the connection manager.

```csharp
conexionAOrigen.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.GetExtendedInterface(connSource);
conexionAOrigen.RuntimeConnectionCollection[0].ConnectionManagerID = connSource.ID;
```

Set the custom properties (AccessMode 0 means "table or view").

```csharp
instance.SetComponentProperty("AccessMode", 0);
instance.SetComponentProperty("TableOrViewName", "\"dbo\".\"" + tableName + "\"");
```

Reinitialize the source metadata.

```csharp
instance.AcquireConnections(null);
instance.ReinitializeMetaData();
instance.ReleaseConnections();
```

Now, add the destination component to the data flow task.
```csharp
IDTSComponentMetaData100 conexionADestination = df.ComponentMetaDataCollection.New();
conexionADestination.ComponentClassID = "Microsoft.SqlServer.Dts.Pipeline.ADONETDestination, Microsoft.SqlServer.ADONETDest, Version=14.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91";
conexionADestination.Name = "ADO NET Destination";
```

Get a design-time instance of the component and initialize it.

```csharp
CManagedComponentWrapper instanceDest = conexionADestination.Instantiate();
instanceDest.ProvideComponentProperties();
```

Specify the connection manager.

```csharp
conexionADestination.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.GetExtendedInterface(connDestination);
conexionADestination.RuntimeConnectionCollection[0].ConnectionManagerID = connDestination.ID;
```

Set the custom properties.

```csharp
instanceDest.SetComponentProperty("TableOrViewName", "\"dbo\".\"" + tableName + "\"");
```

Connect the source component to the destination component.

```csharp
IDTSPath100 path = df.PathCollection.New();
path.AttachPathAndPropagateNotifications(conexionAOrigen.OutputCollection[0], conexionADestination.InputCollection[0]);
```

Reinitialize the destination metadata.

```csharp
instanceDest.AcquireConnections(null);
instanceDest.ReinitializeMetaData();
instanceDest.ReleaseConnections();
```

Map the source output columns to the destination columns by name.

```csharp
foreach (IDTSOutputColumn100 col in conexionAOrigen.OutputCollection[0].OutputColumnCollection)
{
    for (int i = 0; i < conexionADestination.InputCollection[0].ExternalMetadataColumnCollection.Count; i++)
    {
        string c = conexionADestination.InputCollection[0].ExternalMetadataColumnCollection[i].Name;
        if (c.ToUpper() == col.Name.ToUpper())
        {
            // Wire the new input column to the source column's lineage id
            // and to the matching external metadata column.
            IDTSInputColumn100 column = conexionADestination.InputCollection[0].InputColumnCollection.New();
            column.LineageID = col.ID;
            column.ExternalMetadataColumnID = conexionADestination.InputCollection[0].ExternalMetadataColumnCollection[i].ID;
        }
    }
}
```

Save the package to the file system.

```csharp
app.SaveToXml(@"D:\Workspace\SSIS\Test_DB_To_DB.dtsx", Mipk, null);
```

Execute the package and check the result.

```csharp
DTSExecResult result = Mipk.Execute();
Console.WriteLine("Package execution result: " + result);
```

Conclusion

In this article, we explained one of the alternatives for creating SSIS packages: a .NET console application. If you have any questions, please feel free to ask in the comment section below.

What is meant by MSBI?
Oct 16, 2019

MSBI stands for Microsoft Business Intelligence. This powerful suite is composed of tools that help provide the best solutions for business intelligence and data mining. It uses Visual Studio along with SQL Server, and it empowers users to gain access to accurate, up-to-date information for better decision-making in an organization. It offers different tools for the different processes required in a Business Intelligence (BI) solution.

MSBI is divided into three categories:

SSIS - SQL Server Integration Services - the integration tool. It is used for integration tasks such as moving data from one database to another, for example from Oracle to SQL Server or from Excel to SQL Server. It is also used for bulk operations on the database, such as inserting lakhs (hundreds of thousands) of records at once. We can create Integration Services packages that will do the job for us.

SSAS - SQL Server Analysis Services - the analysis tool. It is used to analyze large volumes of data outside the transactional database: it lets us build multidimensional (OLAP) cubes and data-mining models on top of SQL Server, so that heavy analytical workloads, such as aggregating the transactions flowing into the database, do not burden the operational server.

SSRS - SQL Server Reporting Services - the reporting tool. This is a very efficient tool, as it is platform-independent: we can generate a report with it and use that report in any type of application. Nowadays it is very popular in the market.

"A visual always helps to understand any concept better."

Diagram: a broad overview of Microsoft Business Intelligence (MSBI).