Latest Blogs

Easy jQuery Trick
Aug 10, 2020

1. Back to top button

$('a.top').click(function() {
  $(document.body).animate({ scrollTop: 0 }, 800);
  return false;
});
// Create an anchor tag
// <a class="top" href="#">Back to top</a>

As you can see, using the animate and scrollTop functions in jQuery we don't need a plugin to create a simple scroll-to-top animation. By changing the scrollTop value we can change where we want the scrollbar to land; I used a value of 0 because I want it to go to the very top of the page, but for an offset of 100px you would pass 100 instead (the value is a pixel count, not a CSS length). So all we are really doing is animating the body of the document over the course of 800ms until it scrolls all the way to the top.

2. Checking if images are loaded

$('img').load(function() {
  console.log('image load successful');
});

Sometimes you need to check that your images are fully loaded before continuing with your scripts, and this three-line jQuery snippet does that easily. You can also check whether one particular image has loaded by replacing the img selector with an ID or class. (Note: the .load() and .error() event shorthands were removed in jQuery 3.0; use .on('load', ...) and .on('error', ...) there.)

3. Fix broken images automatically

$('img').error(function() {
  $(this).attr('src', 'img/broken.png');
});

Occasionally we have broken image links on our website, and replacing them one by one isn't easy, so adding this simple piece of code can save you a lot of headaches. Even if you don't have any broken links, adding it does no harm.

4. Toggle class on hover

$('.btn').hover(function() {
  $(this).addClass('hover');
}, function() {
  $(this).removeClass('hover');
});

We usually want to change the visual state of a clickable element when the user hovers over it, and this snippet does just that: it adds a class to your element while the user is hovering and removes it when the hover ends, so all you need to do is add the necessary styles in your CSS file.

5. Disabling input fields

$('input[type="submit"]').attr("disabled", true);

On occasion you may want the submit button of a form, or even one of its text inputs, to be disabled until the user has performed a certain action (checking the "I've read the terms" checkbox, for example), and this line of code accomplishes that: it adds the disabled attribute to your input so you can enable it when you want to. To re-enable it, run the removeAttr function on the input with disabled as the parameter:

$('input[type="submit"]').removeAttr("disabled");

6. Stop the loading of links

$('a.no-link').click(function(e) {
  e.preventDefault();
});

Sometimes we don't want a link to navigate to a certain page, or even reload the current one; we want it to do something else, like trigger another script, and in that case this piece of code does the trick of preventing the default action.

7. Toggle fade/slide

// Fade
$('.btn').click(function() {
  $('.element').fadeToggle('slow');
});

// Slide
$('.btn').click(function() {
  $('.element').slideToggle('slow');
});

Slides and fades are something we use plenty in our jQuery animations. Sometimes we just want to show an element when we click something, and for that the fadeIn and slideDown methods are perfect, but if we want that element to appear on the first click and disappear on the second, this code works just fine.

8. Make two divs the same height

$('.div').css('min-height', $('.main-div').height());

9. Self-closing elements

When inserting empty elements into the DOM, you can use self-closing tags:

$('#el1').append('<table class="holly" />');
$('#el2').before('<p class="user-message note" />');

10. Inserting content: before, after, prepend or append?

When inserting elements into an element, use .prepend() or .append(). The difference is that prepend inserts the new element at the start (i.e. as the first child), while append does the exact opposite, inserting the element as the last child:

$('div').prepend('<p>Hello!</p>').append('<p>Goodbye!</p>');

11. Loopy doopy do!

Very often people wonder how to loop through elements individually and then target the current element. For this purpose, we can use .each() and $(this). Let's say we want to add a class to all paragraphs based on their index:

$('p').each(function (i) {
    $(this).addClass('para-' + (i + 1));
});

Remember that i is a zero-based index, i.e. it starts from 0 for the first encountered element.

12. Automatic vendor prefixes

jQuery automatically adds vendor prefixes when it sees fit, saving you the headache of adding them yourself:

$('.box').css({ 'transform': 'translate(100px, 100px) scale(2)' });

(Note that translate values need units, hence 100px rather than a bare 100.)

13. Selectors

As previously mentioned, one of the core concepts of jQuery is to select elements and perform an action. jQuery has done a great job of making the task of selecting an element, or elements, extremely easy by mimicking CSS. On top of the general CSS selectors, jQuery supports all of the unique CSS3 selectors, which work regardless of which browser is being used.

$('.feature');           // Class selector
$('li strong');          // Descendant selector
$('em, i');              // Multiple selector
$('a[target="_blank"]'); // Attribute selector
$('p:nth-child(2)');     // Pseudo-class selector

14. The 'this' selection keyword

When working inside a jQuery function you may want to select the element that was referenced by the original selector. In that event, the this keyword can be used to refer to the element selected in the current handler.

$('div').click(function(event){
   $(this);
});

15. jQuery selection filters

Should CSS selectors not be enough, there are also custom filters built into jQuery to help out. These filters are an extension to CSS3 and provide more control over selecting an element or its relatives.

$('div:has(strong)');

16. Slide demo

// HTML
<div class="panel">
  <div class="panel-stage"></div>
  <a href="#" class="panel-tab">Open <span>&#9660;</span></a>
</div>

// Function
$('.panel-tab').on('click', function(event){
  event.preventDefault();
  $('.panel-stage').slideToggle('slow', function(){
    if($(this).is(':visible')){
      $('.panel-tab').html('Close <span>&#9650;</span>');
    } else {
      $('.panel-tab').html('Open <span>&#9660;</span>');
    }
  });
});

17. Tabs demo

// HTML
<ul class="tabs-nav">
  <li><a href="#tab-1">Features</a></li>
  <li><a href="#tab-2">Details</a></li>
</ul>
<div class="tabs-stage">
  <div id="tab-1">...</div>
  <div id="tab-2">...</div>
</div>

// Show the first tab by default
$('.tabs-stage div').hide();
$('.tabs-stage div:first').show();
$('.tabs-nav li:first').addClass('tab-active');

// Change tab class and display content
$('.tabs-nav a').on('click', function(event){
  event.preventDefault();
  $('.tabs-nav li').removeClass('tab-active');
  $(this).parent().addClass('tab-active');
  $('.tabs-stage div').hide();
  $($(this).attr('href')).show();
});

POWER BI Report Integration
Aug 04, 2020

Here we will see how we can access Power BI embedded reports and dashboards in ASP.NET MVC applications. Let's start with a small introduction to Power BI.

Power BI is a reporting tool provided by Microsoft. It comes in two forms: a Windows (desktop) version and a web version. Using the desktop version we design reports and publish them to the Power BI service, and the web version is then used to view the published reports. Power BI service URL: https://powerbi.microsoft.com/en-us/

Prerequisites to start with Power BI:

Visual Studio 2017 (https://visualstudio.microsoft.com/vs/older-downloads/)
Power BI Desktop (https://powerbi.microsoft.com/en-us/downloads/)
An Azure subscription to register the application (http://portal.azure.com/)

Please follow the steps below to set up the environment before starting the integration:

Create an Active Directory on the Azure portal.
Register the application on the Azure portal. Note: copy the User Id, Password, ClientID, and Client Secret Key somewhere safe.
Grant all access permissions to the application. Later on, you can remove extra permissions as per your requirements.
Register on the Power BI service to host reports and create the workspace.
Create one sample report and deploy it to the workspace we created on the Power BI service.

We could access this report directly, without any authorization, through its URL, but in that case there is no data security. That's why we are using Power BI Embedded: the report is loaded with the help of its embed URL, and we use an authorization token for authentication.

Power BI Integration Steps

1. Generate a bearer token to authenticate your application against Azure Active Directory, with the help of the User Id, Password, ClientID, and Client Secret Key from the Azure portal. We are going to generate this with the help of Postman. This bearer token will be used to access all the APIs related to Power BI, e.g. get the workspace list inside the Active Directory, get the list of items in a workspace, etc.
API URL for generating the bearer token: https://login.microsoftonline.com/common/oauth2/token

2. Now that we have the bearer token, we'll get the workspace (groups) list with its help. The API below provides workspace details such as the workspace name, id, IsReadOnly, etc.
API URL: https://api.powerbi.com/v1.0/myorg/groups/

3. Now we have the bearer token and the group details. Based on these, we'll get the details of the report we hosted for integration. There are different API methods for fetching report details; the one below provides the report id, name, report type, web URL, embed URL, dataset id, etc.
Web URL: this can be accessed without authentication and is open to everyone; you can view the report by browsing the URL directly.
Embed URL: this is the secure URL which we are going to use for secure integration.
API URL: https://api.powerbi.com/v1.0/myorg/groups/{GroupID}/reports

4. Now we have all the details required to access the Power BI report, but we still cannot access it: an authentication token specific to that report is required. So, we'll generate a token for it.
API URL: https://api.powerbi.com/v1.0/myorg/groups/{GroupID}/reports/{ReportId}/generatetoken

We'll use the embed URL and the report authentication token to integrate the report with an existing application:

Create a sample MVC application.
Create all the API methods as explained in steps 1 to 4.
Add the below code in the controller.
Add the below code in the view.

Now run the project; it will display the sample report in the web browser.

Conclusion: after reading this blog, we have learned about the Power BI APIs and the integration of those APIs into an ASP.NET MVC application.
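The requests in steps 1 to 4 can be sketched in a few lines. The sketch below (not the article's original controller code, which used Postman and C#) only assembles the request URL and form body for the Azure AD token endpoint and the Power BI REST URLs quoted above; the credential values are placeholders, the function names are my own, and actually sending the requests (e.g. with the `requests` library) needs a registered application and network access. The `resource` value is the standard Power BI resource identifier for this legacy OAuth2 endpoint.

```python
# Sketch: assemble the Azure AD token request (step 1) and the
# Power BI REST URLs (steps 2-4). An actual call would POST `body`
# as form data to `url` and read `access_token` from the response.

TOKEN_URL = "https://login.microsoftonline.com/common/oauth2/token"
POWER_BI_RESOURCE = "https://analysis.windows.net/powerbi/api"
API_BASE = "https://api.powerbi.com/v1.0/myorg"

def build_token_request(client_id, client_secret, username, password):
    """Form body for the OAuth2 password-grant call (step 1)."""
    body = {
        "grant_type": "password",
        "resource": POWER_BI_RESOURCE,
        "client_id": client_id,
        "client_secret": client_secret,
        "username": username,
        "password": password,
    }
    return TOKEN_URL, body

def report_api_urls(group_id, report_id):
    """REST endpoints used in steps 2-4."""
    return {
        "groups": f"{API_BASE}/groups/",
        "reports": f"{API_BASE}/groups/{group_id}/reports",
        "generate_token": f"{API_BASE}/groups/{group_id}"
                          f"/reports/{report_id}/generatetoken",
    }

url, body = build_token_request("<client-id>", "<client-secret>",
                                "<user-id>", "<password>")
urls = report_api_urls("<GroupID>", "<ReportId>")
```

The bearer token returned by the first call goes into an `Authorization: Bearer ...` header on the three `api.powerbi.com` calls.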

Create SSIS Data Flow Task Package Programmatically
Jul 27, 2020

In this article, we will review how to create an SSIS data flow task package programmatically in a console application, with an example.

Requirements

Microsoft Visual Studio 2017
SQL Server 2014
SSDT

Article

Done with the above requirements? Let's start by launching Microsoft Visual Studio 2017 and creating a new console project. After creating the new project, give it a proper name. In Project Explorer, import the relevant references and ensure that you have declared the namespaces as below:

using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime;
using RuntimeWrapper = Microsoft.SqlServer.Dts.Runtime.Wrapper;

To use the above namespaces we need to add the corresponding references, keeping in mind that all of them should have the same version.

After importing the namespaces, ask the user for the source connection string, the destination connection string, and the table that will be copied to the destination.

string sourceConnectionString, destinationConnectionString, tableName;
Console.Write("Enter Source Database Connection String: ");
sourceConnectionString = Console.ReadLine();
Console.Write("Enter Destination Database Connection String: ");
destinationConnectionString = Console.ReadLine();
Console.Write("Enter Table Name: ");
tableName = Console.ReadLine();

After the declarations, create an instance of Application and Package.

Application app = new Application();
Package Mipk = new Package();
Mipk.Name = "DatabaseToDatabase";

Add the ADO.NET source connection manager to the package.

ConnectionManager connSource;
connSource = Mipk.Connections.Add("ADO.NET:SQL");
connSource.ConnectionString = sourceConnectionString;
connSource.Name = "ADO NET DB Source Connection";

Add the ADO.NET destination connection manager to the package.
ConnectionManager connDestination;
connDestination = Mipk.Connections.Add("ADO.NET:SQL");
connDestination.ConnectionString = destinationConnectionString;
connDestination.Name = "ADO NET DB Destination Connection";

Insert a data flow task into the package.

Executable e = Mipk.Executables.Add("STOCK:PipelineTask");
TaskHost thMainPipe = (TaskHost)e;
thMainPipe.Name = "DFT Database To Database";
MainPipe df = thMainPipe.InnerObject as MainPipe;

Add the ADO.NET source component to the data flow task.

IDTSComponentMetaData100 conexionAOrigen = df.ComponentMetaDataCollection.New();
conexionAOrigen.ComponentClassID = "Microsoft.SqlServer.Dts.Pipeline.DataReaderSourceAdapter, Microsoft.SqlServer.ADONETSrc, Version=14.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91";
conexionAOrigen.Name = "ADO NET Source";

Get a design-time instance of the component and initialize it.

CManagedComponentWrapper instance = conexionAOrigen.Instantiate();
instance.ProvideComponentProperties();

Specify the connection manager.

conexionAOrigen.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.GetExtendedInterface(connSource);
conexionAOrigen.RuntimeConnectionCollection[0].ConnectionManagerID = connSource.ID;

Set the custom properties.

instance.SetComponentProperty("AccessMode", 0);
instance.SetComponentProperty("TableOrViewName", "\"dbo\".\"" + tableName + "\"");

Reinitialize the source metadata.

instance.AcquireConnections(null);
instance.ReinitializeMetaData();
instance.ReleaseConnections();

Now, add the destination component to the data flow task.

IDTSComponentMetaData100 conexionADestination = df.ComponentMetaDataCollection.New();
conexionADestination.ComponentClassID = "Microsoft.SqlServer.Dts.Pipeline.ADONETDestination, Microsoft.SqlServer.ADONETDest, Version=14.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91";
conexionADestination.Name = "ADO NET Destination";

Get a design-time instance of the component and initialize it.
CManagedComponentWrapper instanceDest = conexionADestination.Instantiate();
instanceDest.ProvideComponentProperties();

Specify the connection manager.

conexionADestination.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.GetExtendedInterface(connDestination);
conexionADestination.RuntimeConnectionCollection[0].ConnectionManagerID = connDestination.ID;

Set the custom properties.

instanceDest.SetComponentProperty("TableOrViewName", "\"dbo\".\"" + tableName + "\"");

Connect the source to the destination component.

IDTSPath100 union = df.PathCollection.New();
union.AttachPathAndPropagateNotifications(conexionAOrigen.OutputCollection[0], conexionADestination.InputCollection[0]);

Reinitialize the destination metadata.

instanceDest.AcquireConnections(null);
instanceDest.ReinitializeMetaData();
instanceDest.ReleaseConnections();

Map the source output columns to the destination columns.

foreach (IDTSOutputColumn100 col in conexionAOrigen.OutputCollection[0].OutputColumnCollection)
{
    for (int i = 0; i < conexionADestination.InputCollection[0].ExternalMetadataColumnCollection.Count; i++)
    {
        string c = conexionADestination.InputCollection[0].ExternalMetadataColumnCollection[i].Name;
        if (c.ToUpper() == col.Name.ToUpper())
        {
            IDTSInputColumn100 column = conexionADestination.InputCollection[0].InputColumnCollection.New();
            column.LineageID = col.ID;
            column.ExternalMetadataColumnID = conexionADestination.InputCollection[0].ExternalMetadataColumnCollection[i].ID;
        }
    }
}

Save the package to the file system.

app.SaveToXml(@"D:\Workspace\SSIS\Test_DB_To_DB.dtsx", Mipk, null);

Execute the package.

Mipk.Execute();

Conclusion

In this article, we have explained one of the alternatives for creating SSIS packages: a .NET console application. In case you have any questions, please feel free to ask in the comment section below.

DELETE and UPDATE CASCADE in SQL Server foreign key
Jul 20, 2020

In this article, we will review the DELETE and UPDATE CASCADE rules on SQL Server foreign keys, with different examples.

DELETE CASCADE: when we create a foreign key with this option, deleting a referenced row in the parent table (the one holding the primary key) also deletes the referencing rows in the child table.

UPDATE CASCADE: when we create a foreign key with UPDATE CASCADE, updating the referenced row in the parent table also updates the referencing rows in the child table.

We will be discussing the following topics in this article:

Creating DELETE CASCADE and UPDATE CASCADE rules in a foreign key using a T-SQL script
Triggers on a table with a DELETE or UPDATE cascading foreign key

Let us see how to create a foreign key with DELETE and UPDATE CASCADE rules, along with a few examples.

Creating a foreign key with DELETE and UPDATE CASCADE rules

Please refer to the below T-SQL script, which creates a parent table, a child table, and a foreign key on the child table with the DELETE CASCADE rule. Insert some sample data using the below T-SQL script. Now, check the records. I then deleted the row in the parent table with CountryID = 1, which also deleted the rows in the child table that have CountryID = 1.

Please refer to the below T-SQL script to create a foreign key with the UPDATE CASCADE rule. Now update CountryID in Countries for a row, which also updates the referencing rows in the child table States.

The following T-SQL script creates a foreign key with both UPDATE and DELETE cascade rules. To know the update and delete actions of a foreign key, query the sys.foreign_keys view, replacing the constraint name in the script. The below image shows that a DELETE CASCADE action and an UPDATE CASCADE action are defined on the foreign key.

Let's move forward and check the behavior of the delete and update rules when the foreign keys are on a child table which in turn acts as a parent table to another child table. The below example demonstrates this scenario.
In this case, "Countries" is the parent table of the "States" table, and the "States" table is the parent table of the "Cities" table.

We will now create a foreign key with the DELETE CASCADE rule on the States table, referencing CountryID in the parent table Countries. Then, on the Cities table, create a foreign key without a DELETE CASCADE rule. If we try to delete a record with CountryID = 3, it throws an error: the delete on the parent table "Countries" tries to delete the referencing rows in the child table States, but on the Cities table we have a foreign key constraint with no action for delete, and the referenced value still exists in that table. The delete fails at the second foreign key.

When we create the second foreign key with the DELETE CASCADE rule as well, the above delete command runs successfully, deleting records in the child table "States", which in turn deletes records in the second child table "Cities".

Triggers on a table with a DELETE CASCADE or UPDATE CASCADE foreign key

An INSTEAD OF UPDATE trigger cannot be created on a table if a foreign key with UPDATE CASCADE already exists on it. It throws the error "Cannot create INSTEAD OF DELETE or INSTEAD OF UPDATE TRIGGER 'trigger name' on table 'table name'. This is because the table has a FOREIGN KEY with cascading DELETE or UPDATE." Similarly, we cannot create an INSTEAD OF DELETE trigger on a table when a foreign key with the DELETE CASCADE rule already exists on it.

Conclusion

In this article, we explored a few examples of the DELETE CASCADE and UPDATE CASCADE rules on SQL Server foreign keys. In case you have any questions, please feel free to ask in the comment section below.
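The T-SQL scripts referred to above did not survive extraction, but the cascade behavior they demonstrate is not specific to SQL Server. As an illustrative stand-in, here is a self-contained sketch using SQLite through Python's sqlite3 module; the table and column names mirror the article's Countries/States example, and SQLite needs foreign-key enforcement switched on explicitly:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite disables FK checks by default

conn.execute("CREATE TABLE Countries (CountryID INTEGER PRIMARY KEY, CountryName TEXT)")
conn.execute("""
    CREATE TABLE States (
        StateID   INTEGER PRIMARY KEY,
        StateName TEXT,
        CountryID INTEGER REFERENCES Countries(CountryID)
                  ON DELETE CASCADE ON UPDATE CASCADE
    )""")
conn.execute("INSERT INTO Countries VALUES (1, 'India'), (2, 'USA')")
conn.execute("INSERT INTO States VALUES (1, 'Gujarat', 1), (2, 'Texas', 2)")

# DELETE CASCADE: removing the parent row removes its child rows too.
conn.execute("DELETE FROM Countries WHERE CountryID = 1")
remaining = conn.execute("SELECT StateName FROM States").fetchall()
print(remaining)  # [('Texas',)]

# UPDATE CASCADE: changing the parent key updates the referencing rows.
conn.execute("UPDATE Countries SET CountryID = 20 WHERE CountryID = 2")
rows = conn.execute("SELECT CountryID FROM States").fetchall()
print(rows)  # [(20,)]
```

On SQL Server the equivalent clause on the child table's foreign key is `REFERENCES Countries(CountryID) ON DELETE CASCADE ON UPDATE CASCADE`; the semantics shown here match the article's description.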

Terraform: Infrastructure as Code
Jul 13, 2020

You may have heard of infrastructure as code (IaC), but do you know what infrastructure is? Why do we need infrastructure as code? What are its benefits? Is it safe and secure?

What is Infrastructure as Code (IaC)?

Infrastructure as code (IaC) means managing and upgrading your environments through machine-readable configuration files. Terraform provides infrastructure as code for provisioning, compliance, and management across any public cloud, private data center, and third-party service. It enables teams to write, share, manage, and automate any infrastructure using version control, with automated policy enforcement for security, compliance, and operational best practices, and it enables developers to provision their desired infrastructure from within their own workflows. From a business perspective, IaC has a high impact: increased productivity, reduced risk, and reduced cost.

Why do we use Infrastructure as Code (IaC)?

Terraform uses a simple, human-readable configuration language to define the desired topology of infrastructure resources.

VCS integration: write, version, review, and collaborate on Terraform code using your preferred version control system.

Workspaces: workspaces decompose monolithic infrastructure into smaller components, or "micro-infrastructures". These workspaces can be aligned to teams for role-based access control.

Variables: granular variables allow easy reuse of code and enable dynamic changes to scale resources and deploy new versions.

Runs: Terraform uses two-phased provisioning: a plan (dry run) and apply (execution). Plans can be inspected before execution to ensure expected behavior and safety.

Infrastructure state: the state file is a record of the currently provisioned resources. State files enable a versioned history of the infrastructure and are encrypted at rest. Versions can be inspected to see incremental changes.

Policy as code: Sentinel is a policy-as-code framework to automate multi-cloud governance.
What are the benefits of Infrastructure as Code (IaC)?

Infrastructure as code enables infrastructure teams to test applications in staging or development environments early in the development cycle.

Infrastructure as code saves you time and money. The code itself gives us a version history of when the infrastructure was upgraded and who did it; otherwise we would have to ask the infrastructure admin to dig through logs, which is very time-consuming. Checked into version control, the code gives us an incremental history of who changed what.

Use infrastructure as code to build, update, and manage any cloud, infrastructure, or service. Terraform makes it easy to re-use configurations for similar infrastructure, helping you avoid mistakes and save time: we can use the same configuration code for the different staging, production, and development environments.

Terraform supports many providers with just a few lines of code. The major providers are: AWS, Azure, GitHub, GitLab, Google Cloud Platform, VMware, Docker, and 200+ more.

A simple example to create an EC2 instance with just a few lines of code:

resource "aws_instance" "ec2_instance" {
  ami                    = "ami-*******"
  instance_type          = "t2.micro"
  vpc_security_group_ids = ["${aws_security_group.*****.id}"]
  key_name               = "${aws_key_pair.****.id}"
  tags = {
    Name = "New-EC2-Instance"
  }
}

But first, we have to declare which provider we are writing our code for. Here is the simple basic code to configure a provider:

provider "aws" {
  region = "us-west-2"
  ## PROVIDE CREDENTIALS
}

Now, to create your EC2 instance in AWS, we have to run commands. Terraform has four main commands to check and apply infrastructure changes: init, plan, apply, and destroy.

1. Init

$ terraform init

As the name suggests, this initializes Terraform in our working directory, creating the basic backend and state (tfstate) files in folders for internal use.

2. Plan

$ terraform plan

Much like compiling in other languages, this checks for configuration errors and plans what is going to happen when we apply the configuration. It shows which resources are going to be created and with what configuration.

3. Apply

$ terraform apply

Time to run the configuration and see what it generates. This command executes it and applies the changes to our infrastructure, creating the resources we have described in the code.

4. Destroy

$ terraform destroy

This command is used when we want to remove or destroy resources. When we no longer need a resource, we just run this command, which destroys it, and your money is saved.

Highlights of Upcoming .NET 5
Jul 06, 2020

Microsoft Announces .NET 5

Microsoft has announced that the next release after .NET Core 3.0 will be .NET 5. .NET 5 will be the big release in the .NET family: you will be able to use it to target Windows, Linux, Android, iOS, macOS, tvOS, watchOS, WebAssembly, and more. Microsoft will introduce new .NET APIs, language features, and runtime capabilities as part of .NET 5. Microsoft intends to release .NET 5 in November 2020, with the first preview available in the first half of 2020. It will be supported with future updates to Visual Studio 2019, Visual Studio for Mac, and Visual Studio Code.

.NET 5 Moves Ahead with .NET Core

.NET Core functionalities:

Cross-platform implementation on any device
Support for the key platform features of .NET Core, .NET Framework, and Xamarin
Open source and community oriented
Supported with future updates to Visual Studio Code, Visual Studio 2019, the command line interface, and Visual Studio for Mac
Fast, scalable, and high performance
Side-by-side installation
Support for platform-specific features like Windows Forms and WPF on Windows
Smarter deployment and packages
Small project files

Three new major supports for developers:

Java interoperability will be available on all platforms.
Swift and Objective-C interoperability will be supported on multiple operating systems.
CoreFX will be extended to support static compilation of .NET, smaller footprints, and more operating systems.

The Other Highlighted Features

Desktop development with .NET 5: .NET 5 will come with all the key desktop development functionalities and libraries. WPF, Windows Forms, UWP, and Xamarin are the four key desktop platforms.

Mobile development with .NET: .NET 5 will continue to build cross-platform mobile apps for the Android, iOS, tvOS, macOS, and watchOS platforms using Xamarin.

Runtime and language with .NET 5: Mono is the original cross-platform implementation of .NET.
It started out as an open-source alternative to the .NET Framework and transitioned to targeting mobile devices as Android and iOS devices became popular. CoreCLR is the runtime used as part of .NET Core. It has primarily targeted cloud applications, including the largest services at Microsoft, and is now also being used for Windows desktop, IoT, and machine learning applications.

Cloud development with .NET 5: a major focus of .NET 5 is Azure app development. With the release of the latest version of .NET, developers will continue to develop software with Azure.

Game development with .NET 5: Visual Studio 2019 and .NET 5 will support Unity, a vital part of .NET gaming, to develop games for mobile and other gaming platforms.

New Features and Highlights of Bootstrap 5
Jun 29, 2020

The Bootstrap 5 alpha version was released on June 16, 2020; you can check the official page for the Bootstrap 5 alpha. Bootstrap is a free, open-source, and highly popular CSS framework for websites and web applications. It contains HTML-, CSS-, and (optionally) JavaScript-based design templates for different components of a website or application, such as typography, forms, buttons, navigation, modals, accordions, and tabs, along with helpful JavaScript extensions. Bootstrap is a powerful tool for whatever type of website or web application you are trying to build. The following are some of the notable changes in Bootstrap 5:

jQuery was removed

As we all know, Bootstrap made this decision long ago. The team opened a pull request in 2017 aiming to remove jQuery entirely from Bootstrap, and in the Bootstrap 5 alpha it is finally done and replaced entirely.

Switch to vanilla JavaScript

JavaScript currently dominates the modern web development community. Almost all modern web browsers, on consoles, desktops, tablets, and mobile phones, include JavaScript interpreters, so it is well placed to become a universal scripting standard for the web. As a result, dropping jQuery in favor of vanilla JavaScript reduces the size and weight of the files and libraries used by the framework.

Responsive Font Sizes (RFS)

The Bootstrap 5 alpha enables responsive font sizes by default, automatically resizing typography elements according to the size of the user's viewport through the RFS (Responsive Font Sizes) engine. As per the RFS repository, RFS is a unit resizing engine, and it offers the ability to resize every value of any CSS property with units, like padding, margin, box-shadow, or border-radius.

Enhanced grid system

The Bootstrap 5 alpha isn't a complete departure from v4.
The developers of the first alpha have made sure that everyone can upgrade to this future version easily. Here's a rundown of what's new in the grid system:

New grid tier: XXL, for viewports ≥1400px.
.gutter classes have been replaced with .g* utilities.
Vertical spacing classes have been added.
Columns are no longer position: relative by default.
Form layout options are replaced with the new grid system.
Options for your grid gutter spacing have been added.

Here's a quick example of how to use the new grid gutter classes:

<div class="row g-5">
  <div class="col">...</div>
  <div class="col">...</div>
  <div class="col">...</div>
</div>

Dropped Internet Explorer 10 and 11 support

Internet Explorer was the talk of the town when it was released, as it was the only browser to support Java applets and CSS. It is no longer relevant alongside Chrome, Firefox, and Edge, as it no longer supports modern JavaScript standards and CSS properties, which limits your web design potential. Hence, the Bootstrap 5 alpha dropped support for IE 10 and 11.

Improved customization docs

There are some good improvements in the documentation, like removing ambiguity, giving more explanation, and providing much more support for extending Bootstrap. It all starts with a whole new Customize section. The color palette is expanded in v5, too: with an extensive color system built in, you can more easily customize the look and feel of your app without leaving the codebase. There are improvements in color contrast as well, and they have provided color contrast metrics in the Color docs. Hopefully, this will continue to help make Bootstrap-powered sites more accessible to folks all over.

The custom SVG icon library

With Bootstrap 3, there were 250 reusable icon components in font format, called "Glyphicons". With Bootstrap 4, they were scrapped.
Because of that, developers had to rely on free icon fonts or their own SVG (Scalable Vector Graphics) icons to add value to their web designs. With Bootstrap 5 alpha, however, a brand new SVG icon library was introduced, which adds a fresh touch. These icons are brilliantly crafted by Mark Otto, co-founder of Bootstrap, himself.

Adding CSS custom properties

As mentioned, Bootstrap 5 alpha has begun using CSS custom properties, thanks to dropping support for Internet Explorer. In v4 there were only a handful of root variables for color and fonts; now they have been added for a handful of components and layout options as well. For example, the table component now has local variables that make striped, hoverable, and active table styles easier:

.table {
  --bs-table-bg: #{$table-bg};
  --bs-table-accent-bg: transparent;
  --bs-table-striped-color: #{$table-striped-color};
  --bs-table-striped-bg: #{$table-striped-bg};
  --bs-table-active-color: #{$table-active-color};
  --bs-table-active-bg: #{$table-active-bg};
  --bs-table-hover-color: #{$table-hover-color};
  --bs-table-hover-bg: #{$table-hover-bg};
  // Styles here...
}

Improved utilities API

This is by far the most interesting aspect of Bootstrap 5 alpha: a brand new utility API. Using Bootstrap's utility API, you have unlimited possibilities to create utility classes for positioning, spacing, sizing, and so on. For example, you can easily expand the number of m-* and p-* classes without writing custom Sass code. Here's an example showing how the $utilities variable can be expanded by adding multiple values:

$utilities: () !default;
$utilities: map-merge(
  (
    // ...
    "width": (
      property: width,
      class: w,
      values: (
        25: 25%,
        50: 50%,
        75: 75%,
        100: 100%,
        auto: auto
      )
    ),
    // ...
    "margin": (
      responsive: true,
      property: margin,
      class: m,
      values: map-merge($spacers, (auto: auto))
    ),
    // ...
  ), $utilities);

We think this will be a game-changer for those who build on Bootstrap via its source files, and if you haven't built a Bootstrap-powered project that way yet, your mind will be blown.

Migrating the documentation from Jekyll to Hugo

If you know how WordPress, Drupal, or Joomla works, you may have an idea of how Jekyll works. Jekyll is a free, open-source static site generator that makes it easy to build websites with simple navigation and reusable components, generating all the content at once. With Bootstrap 5 alpha, however, the documentation has stepped up its game and switched to Hugo, a fast and flexible static site generator.

CONCLUSION

One of the frustrating experiences of being a developer is reinventing the base HTML, CSS, and JavaScript for each project. While some prefer to write their own code from scratch, it still makes sense to use an existing framework like Bootstrap. With all the updates that came with Bootstrap 5 alpha, we can surely say that the Bootstrap team is taking very progressive steps to make the framework simpler, lighter, faster, more useful, and more responsive for developers' benefit.
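As a footnote to the CSS custom properties section above: because the table styles read their colors from those --bs-table-* variables, a page can restyle a single table at runtime without recompiling Sass. A minimal sketch, assuming Bootstrap 5's standard .table markup; the .table-forest class name and the green color values are invented here purely for illustration:

```css
/* Hypothetical theme class: overriding Bootstrap 5's table variables
   recolors only the tables that carry this class, with no Sass rebuild. */
.table-forest {
  --bs-table-bg: #e8f5e9;          /* pale green body background */
  --bs-table-striped-bg: #c8e6c9;  /* deeper green for striped rows */
  --bs-table-hover-bg: #a5d6a7;    /* hover highlight */
}
```

Applying class="table table-forest" to one table recolors it while every other table on the page keeps the defaults.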

Importance of Going Digital in the Covid-19 Pandemic
Jun 22, 2020

Since January 2020, the world has been watching the Covid-19 pandemic unfold. The infection has now reached just about every community on the planet, leading to the current state of health crisis and economic uncertainty. Most of us have felt a strong sense of disruption in our work and social lives, and the whole world is collectively trying to understand what happens next. The fact is that, because of the disruption caused by Covid-19, we have experienced more digital transformation in the last 3 months than we have seen in the last 25 years.

To fully grasp what is happening, we must understand that these changes will be driven by the new attitudes and practices our society is now starting to embrace. We are already seeing signs of it: activities such as video conferencing, digital marketing, and online ordering are more effective than we ever experienced before. Many of these positively adapted new arrangements will continue to exist well after the crisis is over, like hygiene ratings on food delivery apps and the acceptance of social distancing practices. Later, rather than asking "why meet over video?", people may ask, "why do we need to meet in person?"

If ever there was a time for digital marketing to come to the fore, it is now. Advertisements that would normally have been seen by thousands now stand beside empty streets, fewer people are venturing out to buy newspapers, and no one is holding events. Meanwhile, with so many people online, and online for longer, the chances of them seeing promotions on social media or engaging with content marketing websites are greater.
There are certain areas digital marketers can concentrate on, many in much greater demand right now:

- Search Engine Optimization
- Social Media Marketing
- Email Marketing
- Content Marketing
- Paid advertising (PPC, SEM, etc.)

The mindset change could be "we are in a battle with the virus, since it has done more damage than competing nations" or "we are seeing the growth of a work-from-home (online work) economy". Researchers and specialists will be placed on a higher platform than they are right now, and governments will be expected to invest in and develop infrastructure and science that looks after the population. The changing outlooks could also shift perspectives on social media.

Benefits of online marketing:

- Convenience and quick service
- Low operating costs
- Measurable, trackable results
- Demographic targeting
- Global marketing
- Ability to multitask
- 24/7 marketing
- Instant transaction service
- Time-effective marketing

In summary, our attitudes and practices have already begun to change. Clearly, the world will never be the same, because there will be lasting effects. After we stabilize essential services, we will likely reshape our goals for development and policy around digital services. From this logic, we can see that the world's need for digital technology and digital transformation is only accelerating.

Table partitioning in SQL
Jun 10, 2020

What is table partitioning in SQL? Table partitioning is a way to divide a large table into smaller, more manageable parts without having to create separate tables for each part. Data in a partitioned table is physically stored in groups of rows called partitions, and each partition can be accessed and maintained separately. Partitioning is not visible to end users; a partitioned table behaves like one logical table when queried. Data in a partitioned table is partitioned based on a single column, the partition column, often called the partition key. Only one column can be used as the partition column, but it is possible to use a computed column. The partition scheme maps the logical partitions to physical filegroups; it is possible to map each partition to its own filegroup, or all partitions to one filegroup.
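The pieces described above (a partition key, boundary values, and a partition scheme mapping partitions to filegroups) fit together roughly like this in SQL Server's T-SQL. This is a sketch only; the object and column names are invented for illustration, and all partitions are mapped to the PRIMARY filegroup for simplicity:

```sql
-- 1. Partition function: defines boundary values on the partition column.
--    RANGE RIGHT with three boundaries yields four partitions:
--    (<2019), [2019-2020), [2020-2021), (>=2021).
CREATE PARTITION FUNCTION pfOrderYear (int)
    AS RANGE RIGHT FOR VALUES (2019, 2020, 2021);

-- 2. Partition scheme: maps the logical partitions to filegroups
--    (here, all partitions go to the same filegroup).
CREATE PARTITION SCHEME psOrderYear
    AS PARTITION pfOrderYear ALL TO ([PRIMARY]);

-- 3. Create the table on the scheme; OrderYear is the partition key.
CREATE TABLE dbo.Orders (
    OrderId   int   NOT NULL,
    OrderYear int   NOT NULL,
    Amount    money NOT NULL
) ON psOrderYear (OrderYear);
```

Queries against dbo.Orders need no changes; the optimizer can eliminate partitions automatically when a WHERE clause filters on OrderYear.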