
Why is PESTLE Analysis Important for Your Business?
Oct 16, 2020

Marketing is an important aspect of an organization's products and services. The right marketing strategy allows organizations to achieve their goals and gain an upper hand in a competitive market. Enterprises worldwide conduct various short-term and long-term analyses to assess their business environment. One such analysis is PESTLE Analysis.

What is PESTLE Analysis (Macro-Environmental Factors)? It is a framework used to examine and monitor the macro-environmental factors that may have a significant impact on an organization's performance. The tool is especially useful when starting a new business or expanding globally.

Now, let's dive deeper into the factors that make up the PESTLE Analysis:

Political Factors - All the influences that a government has on your business fall here: government policy, political stability or instability, corruption, foreign trade policy, tax policy, labor law, environmental law, trade restrictions, etc. Political factors affect organizations to a huge extent, because any decision a government makes regarding laws and policies has a direct impact on business. Heavy political and government interference is rarely good for business and can hurt a business setup in the long run. Similarly, trade regulations imposed by a state government might affect an organization's cost of labor and its ability to target an overseas market.

Example - Suppose "Govt. A" is currently in power and has decided in favor of globalization. Overseas companies will naturally be eager to invest: they will expand their business, leave no stone unturned to build a trusted brand, and open factories and stores in various states. But what if "Govt. A" loses the next election and the new "Govt. B" rejects globalization? Imagine how much the overseas companies that invested so heavily in the country stand to lose. This is why the political factor is often the most important factor for your business.

Economic Factors - Marketing is directly tied to economic factors. Organizations cannot launch or promote their products and services effectively if the economy is not steady and growing. Economic factors have a direct or indirect long-term impact on an organization, since they affect the purchasing power of consumers and can change the demand/supply balance within the economy. They also affect the way companies price their products and services. Economic factors include GDP, economic growth, exchange rates, inflation rates, interest rates, disposable income of consumers, and unemployment rates.

Social Factors - Social factors include the demographics, customs, norms, culture, and population within which the organization operates. They also include population trends such as the population growth rate, age distribution, income distribution, career choices, attitudes toward safety, health consciousness, lifestyle trends, and cultural barriers. These factors are especially important for marketers when targeting niche customers. Example - Social factors are the reason why youth-centric start-ups, companies, products, and services are doing very well in many countries, especially India.

Technological Factors - These refer to innovations in technology that may affect an organization's operations, favorably or unfavorably. In today's era, the impact of technology is enormous and multifaceted, and marketing strategies must account for constantly changing technology, because technology strengthens the business structure. Technological factors include technology incentives, the level of innovation, automation, research and development (R&D) activity, technological change, and the amount of technological awareness a market possesses. These factors influence decisions such as whether to enter certain industries, whether to launch certain products, and whether to outsource production abroad. By knowing what is going on in the current technology market, you may be able to prevent your company from spending heavily on developing a technology that would soon become outdated.

Legal Factors - When it comes to legal factors, organizations need to be crystal clear about what is legal and what is not in order to trade successfully and ethically. The situation becomes even trickier when an organization trades globally, because every country has its own rules and regulations. You also need to be aware of possible changes in legislation that may affect your business in the future. In such situations, having a legal advisor can prove very beneficial.

Environmental Factors - Organizations cannot survive if the environment is not suitable or favorable. Environmental factors are not within an organization's control, but before setting up a business an organization needs to be clear about whether the products or services it offers suit the environment in question. Location affects an organization's marketing plans, since changes in climate and in consumers' desire for a particular offering can be an issue. Weather, climate, environmental policies, natural disasters, air and water pollution, recycling standards, and support for renewable energy all affect an organization's long-term planning.

Conclusion - PESTLE Analysis provides a broad study of the external factors that pose a challenge to an organization's marketing strategy. To successfully achieve marketing goals, organizations should use PESTLE Analysis to assess market conditions so that they can plan and strategize accordingly.

What is Mobile App Testing?
Oct 07, 2020

Many mobile applications are launched in the market every day, but only a few succeed. Mobile app testing is the process of finding bugs, issues, and glitches in a mobile application. With such a high level of competition, you have to be sure that, apart from offering something innovative and interesting to your customers, your application is also free from glitches. That is why mobile app testing is becoming so important.

What is Mobile App Testing? Mobile application testing is a process by which application software developed for portable mobile devices is tested for functionality, usability, and consistency. Mobile app testing can be automated or manual.

Why is mobile application testing necessary? Real-world conditions vary widely, so apps need to be tested while:
Turning GPS on/off
Using different devices
Using different OSs
Using different screen resolutions
Testing in different screen orientations (landscape, portrait)
And more...

Types of Mobile Application
Mobile Web Application
Hybrid Application
Native Application

Mobile Web Applications: Mobile web applications are applications for mobile devices that only require a web browser to be installed on the device. Simple mobile web applications limit the use of RIA technologies and are designed to present information in a readable and action-oriented format. In short, a web page that you open in the mobile browser is a mobile web application.

Native Application: A native application is a software program developed for use on a particular platform or device, such as a Windows 10, Android, iOS, or BlackBerry app.

Hybrid Application: A hybrid application is a software application that combines elements of native applications and web applications. Hybrid applications are essentially web applications that have been placed in a native application shell.

5 Major Types of Mobile Application Testing:

Usability Testing: Checks how friendly the app is in terms of use and intuitiveness. Mobile apps need to be tested early and often. User experience requirements include clarity of navigation, an intuitive interface, the appearance of the application design, and error messages and their handling.

Functional Testing: Verifies that the application is working correctly. This type of test focuses on the main purpose and flow of the application, ensuring that all of its functions are responsive and within specification.

Performance Testing: Performance testing is a key element of the mobile app testing process. It lets you track and predict performance changes for spikes in connection quality (3G, 4G, LTE), changes in a user's location, increased traffic, etc. With mobile applications, you should also test the product on different devices to see whether performance is affected by changes in screen dimensions.

Installation Testing: Installation testing verifies that your mobile application installs successfully on various mobile devices, models, and operating systems.

Mobile Device Testing: Mobile device testing ensures the quality of the mobile devices themselves, such as mobile phones, PDAs, etc. These tests cover both hardware and software.

Conclusion: Mobile app testing has become a critical part of mobile app development. Most of the problems an app faces can be caught by a successful mobile app test. Testing also shortens time to market and supports the success of the application.

Basics of SSIS (SQL Server Integration Services)
Sep 14, 2020

What is SSIS? SSIS is a platform for data integration and workflow applications. It features a data warehousing tool used for data extraction, transformation, and loading (ETL). The tool may also be used to automate maintenance of SQL Server databases and updates to multidimensional cube data. SQL Server Integration Services (SSIS) is a component of the Microsoft SQL Server database software that can be used to perform a wide range of data migration tasks. SSIS is a fast and flexible data warehousing tool used for data extraction, loading, and transformations such as cleaning, aggregating, and merging data. It makes it easy to move data from one database to another. SSIS can extract data from a wide variety of sources, such as SQL Server databases, Excel files, and Oracle and DB2 databases. SSIS also includes graphical tools and wizards for performing workflow functions such as sending email messages and FTP operations, and for configuring data sources and destinations.

Features of SSIS:
Organized and lookup transformations
Tight integration with the rest of the Microsoft SQL family
Rich studio environments
A wide range of data integration functions for better transformations
High-speed data connectivity

Why SSIS?
Extract, Transform, and Load (ETL) data from SQL Server to a file, and from a file back to SQL Server.
Send email.
Download files from FTP.
Rename, delete, and move files from a defined path.
Join tables from different databases (SQL Server, Oracle, etc.) and from potentially different servers.

How does SSIS work? SSIS consists of three major components:

Operational Data: An operational data store (ODS) is a database designed to integrate data from multiple sources for additional operations on the data. This is where most of the data used in current operations is housed before it is transferred to the data warehouse for longer-term storage or archiving.

ETL Process: ETL is a process to Extract, Transform, and Load the data.
Extract, Transform, and Load (ETL) is the process of extracting data from various sources, transforming the data to meet your requirements, and then loading it into a target data warehouse. ETL provides a one-stop solution for all of these steps.

Extract: Extraction is the process of pulling data from various homogeneous or heterogeneous data sources based on different validation points.
Transform: In transformation, the data is analyzed and various functions are applied to it so that it can be loaded into the target database in a clean and consistent format.
Load: Loading is the process of writing the processed data to a target data repository using minimal resources.

Data Warehouse: A data warehouse captures data from diverse sources for useful analysis and access. Data warehousing assembles and manages a large accumulated data set from various sources for the purpose of answering business questions, and hence helps in decision making.

How to install SSDT (SQL Server Data Tools)?
Prerequisites and environment setup for an SSIS project: for starting with SSIS we need two studios. SQL Server Data Tools (SSDT) is used for developing the Integration Services packages that a business solution requires; SSDT provides the Integration Services project in which you create packages.

Installation steps:
Download the SSDT setup from the Microsoft website.
URL:
When you open the .exe file, you will be asked to restart the system before installation. So restart first, run the setup, and press Next. The installer shows the required tools and features, such as SQL Server Database, SSAS (SQL Server Analysis Services), SSRS (SQL Server Reporting Services), and SSIS (SQL Server Integration Services). Make sure you check SSIS and click the "Install" button. Refer to the screenshot below.
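SSIS performs ETL graphically through a Data Flow Task, but the three steps described above are easy to see in plain code. Below is a minimal sketch of an Extract-Transform-Load flow in Python using sqlite3; the table and column names (`orders`, `fact_orders`, etc.) are invented for illustration and are not part of SSIS itself:

```python
import sqlite3

# Two in-memory databases stand in for a source system and a target warehouse.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

source.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount TEXT)")
source.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "  Alice ", "100.50"), (2, "BOB", "75"), (3, None, "20")],
)

# Extract: pull the raw rows from the source.
rows = source.execute("SELECT id, customer, amount FROM orders").fetchall()

# Transform: clean the data into a consistent format
# (trim/normalize names, cast amounts to numbers, drop invalid rows).
clean = [
    (oid, customer.strip().title(), float(amount))
    for oid, customer, amount in rows
    if customer is not None
]

# Load: write the processed rows into the target warehouse table.
target.execute("CREATE TABLE fact_orders (id INTEGER, customer TEXT, amount REAL)")
target.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", clean)
target.commit()

print(target.execute("SELECT customer, amount FROM fact_orders ORDER BY id").fetchall())
# [('Alice', 100.5), ('Bob', 75.0)]
```

In SSIS, the same flow would be a source component, one or more transformation components, and a destination component wired together in the Data Flow tab.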
In SSIS we will look at the following:
Variables
Connection Manager
SSIS Toolbox
Containers
Tasks
Data Flow Task

Variables: Variables store values that an SSIS package and its containers, tasks, and event handlers can use at runtime.

System variables: Defined by Integration Services. SSIS provides a set of system variables that store information about the running package and its objects. These variables can be used in expressions and property expressions to customize packages, containers, tasks, and event handlers.

User-defined variables: Defined by package developers.

How to create a user-defined variable?

How to set an expression for a variable?

Connection Manager: SSIS provides different types of connection managers that enable packages to connect to a variety of data sources and servers:
There are built-in connection managers that Setup installs when you install Integration Services.
There are connection managers that are available for download from the Microsoft website.
You can create your own custom connection manager if the existing connection managers do not meet your needs.

Let's see how to add a connection manager:
1) Solution Explorer > Connection Managers > New Connection Manager. You can see the list of connection managers for different types of connections.
2) Add the connection manager.
After adding your connection, you can see all connections here.

SSIS Toolbox: Steps: Menu bar > SSIS > select SSIS Toolbox. You can now see the SSIS Toolbox on the left side. The SSIS Toolbox lists the tasks and containers you can use.

List of Containers:
Foreach Loop Container: Runs a control flow repeatedly by using an enumerator.
For Loop Container: Runs a control flow repeatedly by testing a condition.
Sequence Container: Groups tasks and containers into control flows that are subsets of the package control flow.

List of Tasks:
Data Flow Task: The task that runs data flows to extract data, apply column-level transformations, and load data.
Data Preparation Tasks: These tasks copy files and directories; download files and data; run Web methods; apply operations to XML documents; and profile data for cleansing.

Workflow Tasks: Tasks that communicate with other processes to run packages, run programs or batch files, send and receive messages between packages, send e-mail messages, read Windows Management Instrumentation (WMI) data, and watch for WMI events.

SQL Server Tasks: Tasks that access, copy, insert, delete, and modify SQL Server objects and data.

Scripting Tasks: Tasks that extend package functionality by using scripts.

Analysis Services Tasks: Tasks that create, modify, delete, and process Analysis Services objects.

Maintenance Tasks: Tasks that perform administrative functions such as backing up and shrinking SQL Server databases, rebuilding and reorganizing indexes, and running SQL Server Agent jobs.

You can add a task or container by dragging it from the SSIS Toolbox to the design area.

Data Flow Task: Drag the Data Flow Task from the SSIS Toolbox to the design area and double-click it. You are now in the Data Flow tab, and the SSIS Toolbox shows a different set of components:
Source: Where your data comes from.
Destination: Where you want to move your data.
Transformation: The operations that perform ETL (Extract, Transform, Load).

Conclusion: SQL Server Integration Services provides tasks to transform and validate data during the load process, and transformations to insert data into your destination. Rather than creating a stored procedure with T-SQL to validate or change data, it is good to know the different SSIS tasks and how they can be used.

Test Automation using Robot Framework
Aug 31, 2020

In this article, I will explain how we can prepare and manage automation test cases in a better way with Robot Framework. As an example, let's look at the list of test cases below. In the image above, the first column is the test case number, the second column is the page name, and the third column is the actual test case that we need to automate. So we need to write automation scripts for the Payer - Order Details page.

The first step is to prepare a page-wise resource file in Robot Framework. For our example: Order.robot. In this file, we define different keywords based on the different functionalities of the Order page. For example, one keyword is defined to open the Order menu item, i.e. Open Order Menu - Payer. We can define another keyword to search for an order number using the filter. Similarly, we can define other keywords, such as opening any order from the search results. See the image below.

This way we can define various keywords, based on the page object model, for different pages. For our actual test cases, we then simply call these keywords as and when required, after importing their respective resource files.

As you can see in the image above, we can simply convert manual test steps into automation steps. We can add verification points to our test cases based on the test case requirements and group all these test cases under one test suite. We can run the test suite, and on successful completion of the test run, we get a report similar to the image below.

This is the fastest way to implement test automation using Robot Framework. I hope this information is useful to you if you are starting a new automation project and analyzing how to manage automation scripts in a better way.
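Since the article's screenshots are not reproduced here, a concrete sketch may help. A page-wise resource file like Order.robot could look like the following; the locators and the Order Details page name are hypothetical examples, not taken from the original project:

```robotframework
# Order.robot - page-wise resource file for the Order page
*** Settings ***
Library    SeleniumLibrary

*** Keywords ***
Open Order Menu - Payer
    # Assumed locator; replace with the real menu element of your application
    Click Element    xpath=//a[text()='Orders']
    Wait Until Page Contains    Order Details

Search Order Number
    [Arguments]    ${order_number}
    # Assumed id of the filter input field
    Input Text    id=order-filter    ${order_number}
    Press Keys    id=order-filter    ENTER
```

A test suite then imports the resource file and calls these keywords as steps, which is how manual test steps map one-to-one onto automation steps:

```robotframework
# OrderTests.robot - test suite reusing the keywords above
*** Settings ***
Resource    Order.robot

*** Test Cases ***
TC01 Payer Can Open And Search An Order
    Open Order Menu - Payer
    Search Order Number    12345
    Page Should Contain    12345
```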

Why is Software Testing Important in the Software Development Life Cycle?
Aug 24, 2020

Testing is important because it discovers defects and errors before delivery to the customer, ensuring the quality of the software. It makes the software more reliable and easier to use. Thoroughly tested software ensures reliable, high-performance operation.

For example, suppose you are using an online shopping application to place an order and pay for it through an online payment platform. You initiate the transaction, get a successful-transaction message, and the amount is deducted from your account. However, your order is never placed, and your account does not reflect a reversed transaction either. This will surely make you upset and leave you an unsatisfied customer.

Software testing contributes to the overall success of the project:

During requirement review and project timeline creation - when testers review the project requirements, they help identify some requirement defects even before implementation, which helps with project cost and timeline.
During the design phase - when testers work with the system designers, it increases the quality of the design, helps reduce design flaws, and makes the product more user friendly.
During development - when testers work with developers, they learn which areas the developers consider risky and can adjust their priorities accordingly. In addition, the developers get the testers' insights, which helps recreate errors there and then, without going through a lengthy defect-management process.
Before the software release - testers verify and validate the software. This helps detect errors that might otherwise go unnoticed and supports the process of removing the defects that caused the failures. Running tests at multiple levels increases the likelihood that the software will have fewer bugs and will also meet customer needs.

Reasons Why Software Testing is Important

1. Identification of Bugs and Defects
The testing phase is the phase that determines the bugs and errors in the application. These bugs can be at the unit level or the system level. With multiple testing phases, you keep away from the kinds of bugs that may affect your application.

2. Improvement in Product Quality
Testing is the phase that compares the actual outcome with the expected outcome, so it can help you improve your product quality. With proper testing done, you eliminate errors and deliver a polished software application to the users.

3. Technical Importance
In any SDLC, the testing phase is important from a technical perspective, as it has to produce the most technically sound application possible.

4. Improved Confidence and Safety
Properly tested software gives you more confidence in shipping great software. It also improves the quality of your application, because continuous testing of all kinds results in a secure, safe application for end users.

5. Verification and Validation
One of the main aims of the testing phase in the SDLC is verification and validation. Testing can serve as a metric, as it is used heavily in the V&V method; based on the results, you can compare quality among different products.

6. Reliability Estimation
This is another important factor determined by the testing phase. If your software application has gone through all the smaller levels such as unit testing, and the major levels such as regression testing, then it is a reliable application. Hence, testing determines the reliability of your software application. Testing can serve as a statistical method, defining operational testing of the application and resulting in a reliable software product.

7. Proving Usability and Operability
One very important aim of software testing is to prove that the software is both usable and operable. Usability testing is where the software is released to a select group of users and their work with the product is observed. All aspects of a user's interaction with the software, such as ease of use and where users face problems, are recorded and analyzed.

Conclusion: To conclude, the importance of software testing is clear. Software testing is a crucial component of software product development because it improves consistency and performance. The main benefit of testing is the identification and subsequent removal of errors. Testing also helps developers and testers compare actual and expected results in order to improve quality. If software is shipped without testing, it can be useless or even dangerous for customers. So, a tester wears a unique hat that protects the reliability of the software and makes it safe to use in real-life scenarios.

Kafka with ELK implementation
Aug 17, 2020

Apache Kafka is the most common buffer solution deployed together with the ELK Stack. Kafka sits between the log shippers and the indexing units, acting as a buffer for the data being collected. In this blog, we'll see how to deploy all the components required to set up a resilient log pipeline with Apache Kafka and the ELK Stack:

Filebeat - collects logs and forwards them to a Kafka topic.
Kafka - brokers the data flow and queues it.
Logstash - aggregates the data from the Kafka topic, processes it, and ships it to Elasticsearch.
Elasticsearch - indexes the data.
Kibana - for analyzing the data.

My environment: To perform the steps below, I set up a single Ubuntu 18.04 VM on AWS EC2 using local storage. In real-life scenarios you will probably have all these components running on separate machines. I started the instance in the public subnet of a VPC and then set up a security group to enable access from anywhere using SSH and TCP 5601 (for Kibana). The pipeline uses Apache access logs; you could also use VPC Flow Logs, ALB access logs, etc.

Log in to your Ubuntu system with sudo privileges. For a remote Ubuntu server, use ssh to access it; Windows users can use PuTTY or PowerShell to log in. Elasticsearch requires Java to run on any system, so make sure your system has Java installed:

sudo apt install openjdk-11-jdk-headless

Check that the installation succeeded with the following command, which shows the current Java version:

~$ java -version
openjdk 11.0.3 2019-04-16
OpenJDK Runtime Environment (build 11.0.3+7-Ubuntu-1ubuntu218.04.1)
OpenJDK 64-Bit Server VM (build 11.0.3+7-Ubuntu-1ubuntu218.04.1, mixed mode, sharing)

Finally, I added a new elastic IP address and associated it with the running instance. The example logs used in this tutorial are Apache access logs.
Step 1: Installing Elasticsearch
We will start by installing the main component in the stack, Elasticsearch. Since version 7.x, Elasticsearch is bundled with Java, so we can jump right ahead with adding Elastic's signing key.

Download and install the public signing key:

wget -qO - | sudo apt-key add -

You may need to install the apt-transport-https package on Debian before proceeding:

sudo apt-get install apt-transport-https

Our next step is to add the repository definition to the system:

echo "deb stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-7.x.list

You can then install the Elasticsearch Debian package with:

sudo apt-get update && sudo apt-get install elasticsearch

Before we bootstrap Elasticsearch, we need to apply some basic configuration in the Elasticsearch configuration file at /etc/elasticsearch/elasticsearch.yml:

sudo nano /etc/elasticsearch/elasticsearch.yml

Since we are installing Elasticsearch on AWS, we will bind Elasticsearch to localhost.
Also, we need to define the private IP of our EC2 instance as a master-eligible node:

network.host: "localhost"
http.port: 9200
cluster.initial_master_nodes: ["<InstancePrivateIP>"]

Save the file and run Elasticsearch with:

sudo service elasticsearch start

To confirm that everything is working as expected, point curl to http://localhost:9200, and you should see something like the following output (give Elasticsearch a minute or two before you start to worry about not seeing any response):

{
  "name" : "elasticsearch",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "W_Ky1DL3QL2vgu3sdafyag",
  "version" : {
    "number" : "7.2.0",
    "build_flavor" : "default",
    "build_type" : "deb",
    "build_hash" : "508c38a",
    "build_date" : "2019-06-20T15:54:18.811730Z",
    "build_snapshot" : false,
    "lucene_version" : "8.0.0",
    "minimum_wire_compatibility_version" : "6.8.0",
    "minimum_index_compatibility_version" : "6.0.0-beta1"
  },
  "tagline" : "You Know, for Search"
}

Step 2: Installing Logstash
Next up, the "L" in ELK: Logstash. Since we already defined the repository in the system, all we have to do to install Logstash is run:

sudo apt-get install logstash -y

Logstash requires Java as well; verify it is installed:

java -version
openjdk version "1.8.0_191"
OpenJDK Runtime Environment (build 1.8.0_191-8u191-b12-2ubuntu0.16.04.1-b12)
OpenJDK 64-Bit Server VM (build 25.191-b12, mixed mode)

Next, we will configure a Logstash pipeline that pulls our logs from a Kafka topic, processes these logs, and ships them on to Elasticsearch for indexing. Create a new config file:

sudo nano /etc/logstash/conf.d/apache.conf
The new config file should contain:

input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => "apache"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
  geoip {
    source => "clientip"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

As you can see, we're using the Logstash Kafka input plugin to define the Kafka host and the topic we want Logstash to pull from. We're applying some filtering to the logs, and we're shipping the data to our local Elasticsearch instance.

Step 3: Installing Kibana
Let's move on to the next component in the ELK Stack: Kibana. As before, we will use a simple apt command to install Kibana:

sudo apt-get install kibana

We will then open up the Kibana configuration file at /etc/kibana/kibana.yml and make sure we have the correct configurations defined:

server.port: 5601
server.host: "<INSTANCE_PRIVATE_IP>"
elasticsearch.hosts: ["http://<INSTANCE_PRIVATE_IP>:9200"]

Then enable and start the Kibana service:

sudo systemctl enable kibana
sudo systemctl start kibana

We will also need to install Filebeat:

sudo apt install filebeat

Open up Kibana in your browser at http://<PUBLIC_IP>:5601. You will be presented with the Kibana home page.
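To complete the pipeline sketched at the top of the post, Filebeat has to ship the Apache access logs into the "apache" Kafka topic that the Logstash input reads from. A minimal filebeat.yml sketch could look like the following; the access-log path is an assumption, so adjust it to wherever your Apache logs actually live:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/apache2/access.log   # assumed Apache access-log location

output.kafka:
  hosts: ["localhost:9092"]   # the Kafka broker the Logstash input connects to
  topic: "apache"             # must match the topic Logstash subscribes to
```

With this in place, restarting Filebeat (sudo systemctl restart filebeat) starts feeding log lines into Kafka, where Logstash picks them up, parses them, and indexes them into Elasticsearch for Kibana to visualize.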

Easy jQuery Tricks
Aug 10, 2020

1. Back to top button

$('.top').click(function() {
  $(document.body).animate({scrollTop: 0}, 800);
  return false;
});
// Create an anchor tag
<a class="top" href="#">Back to top</a>

As you can see, using the animate and scrollTop functions in jQuery we don't need a plugin to create a simple scroll-to-top animation. By changing the scrollTop value we can change where we want the scrollbar to land; in my case I used a value of 0 because I want it to go to the very top of the page, but if I wanted an offset of 100px I could just pass 100 instead. So all we are really doing is animating the body of the document over the course of 800 ms until it scrolls all the way to the top.

2. Checking if images are loaded

$('img').load(function() {
  console.log('image load successful');
});

Sometimes you need to check whether your images are fully loaded before continuing with your scripts; this three-line jQuery snippet can do that for you easily. You can also check whether one particular image has loaded by replacing the img tag with an ID or class.

3. Fix broken images automatically

$('img').error(function() {
  $(this).attr('src', 'img/broken.png');
});

Occasionally we have broken image links on a website, and replacing them one by one isn't easy, so adding this simple piece of code can save you a lot of headaches. Even if you don't have any broken links, adding it does no harm.

4. Toggle class on hover

$('.btn').hover(function() {
  $(this).addClass('hover');
}, function() {
  $(this).removeClass('hover');
});

We usually want to change the visual state of a clickable element when the user hovers over it, and this jQuery snippet does just that: it adds a class to your element while the user is hovering and removes the class when the hover ends, so all you need to do is add the necessary styles in your CSS file.

5.
Disabling input fields

$('input[type="submit"]').attr("disabled", true);

On occasion you may want the submit button of a form, or even one of its text inputs, to be disabled until the user has performed a certain action (checking the "I've read the terms" checkbox, for example), and this line of code accomplishes that: it adds the disabled attribute to your input so you can enable it when you want to. To do that, all you need to do is run the removeAttr function on the input with disabled as the parameter:

$('input[type="submit"]').removeAttr("disabled");

(With modern jQuery, .prop('disabled', true/false) is the preferred way to toggle this.)

6. Stop the loading of links

$('a').click(function(e) {
  e.preventDefault();
});

Sometimes we don't want links to go to a certain page or reload it; we want them to do something else, like trigger some other script, and in that case this piece of code will do the trick of preventing the default action.

7. Toggle fade/slide

// Fade
$('.btn').click(function() {
  $('.element').fadeToggle('slow');
});
// Slide
$('.btn').click(function() {
  $('.element').slideToggle('slow');
});

Slides and fades are something we use plenty in our jQuery animations. Sometimes we just want to show an element when we click something, and for that the fadeIn and slideDown methods are perfect; but if we want that element to appear on the first click and then disappear on the second, these toggle methods will work just fine.

8. Make two divs the same height

$('.div').css('min-height', $('.main-div').height());

This sets the min-height of one div to the height of another, so the two line up even when their content differs.

9. Self-closing elements

When inserting empty elements into the DOM, you can use self-closing tags:

$('#el1').append('<table class="holly" />');
$('#el2').before('<p class="user-message note" />');

10. Inserting content: before, after, prepend or append?

When inserting elements into an element, use .prepend() or .append(). The difference is that prepend inserts the new element at the start (i.e.
inserts the element as the first child), while append does the exact opposite, inserting the element as the last child:

$('div').prepend('<p>Hello!</p>').append('<p>Goodbye!</p>');

11. Loopy doopy do!

Very often people wonder how to loop through each element individually and then target the current element. For this purpose, we can use .each() and $(this). Let's say we want to add a class to all paragraphs based on their index. If you are working on a static page, you might be tempted to hardcode it:

$('p').each(function (i) {
  $(this).addClass('para-' + (i + 1));
});

Remember that i is a zero-based index, i.e. it starts from 0 for the first encountered element.

12. Automatic vendor prefixes

jQuery automatically adds vendor prefixes when it sees fit, saving you the headache of adding vendor prefixes yourself:

$('.box').css({
  'transform': 'translate(100px, 100px) scale(2)'
});

13. Selectors

As previously mentioned, one of the core concepts of jQuery is to select elements and perform an action. jQuery has done a great job of making the task of selecting an element, or elements, extremely easy by mimicking CSS. On top of the general CSS selectors, jQuery has support for all of the unique CSS3 selectors, which work regardless of which browser is being used.

$('.feature');           // Class selector
$('li strong');          // Descendant selector
$('em, i');              // Multiple selector
$('a[target="_blank"]'); // Attribute selector
$('p:nth-child(2)');     // Pseudo-class selector

14. 'this' Selection Keyword

When working inside a jQuery function you may want to select the element that was referenced in the original selector. In this event, the this keyword may be used to refer to the element selected in the current handler.

$('div').click(function(event) {
  $(this);
});

15. jQuery Selection Filters

If CSS selectors are not enough, there are also custom filters built into jQuery to help out.
These filters are an extension to CSS3 and provide more control over selecting an element or its relatives.

$('div:has(strong)');

16. Slide Demo

// HTML
<div class="panel">
  <div class="panel-stage"></div>
  <a href="#" class="panel-tab">Open <span>&#9660;</span></a>
</div>

// Function
$('.panel-tab').on('click', function(event) {
  event.preventDefault();
  $('.panel-stage').slideToggle('slow', function() {
    if ($(this).is(':visible')) {
      $('.panel-tab').html('Close <span>&#9650;</span>');
    } else {
      $('.panel-tab').html('Open <span>&#9660;</span>');
    }
  });
});

17. Tabs Demo

// HTML
<ul class="tabs-nav">
  <li><a href="#tab-1">Features</a></li>
  <li><a href="#tab-2">Details</a></li>
</ul>
<div class="tabs-stage">
  <div id="tab-1">...</div>
  <div id="tab-2">...</div>
</div>

// Show the first tab by default
$('.tabs-stage div').hide();
$('.tabs-stage div:first').show();
$('.tabs-nav li:first').addClass('tab-active');

// Change tab class and display content
$('.tabs-nav a').on('click', function(event) {
  event.preventDefault();
  $('.tabs-nav li').removeClass('tab-active');
  $(this).parent().addClass('tab-active');
  $('.tabs-stage div').hide();
  $($(this).attr('href')).show();
});

Power BI Report Integration
Aug 04, 2020

Here we will see how we can access Power BI Embedded reports and dashboards in an ASP.NET MVC application. Let's start with a small introduction to Power BI. Power BI is a reporting tool provided by Microsoft. It is available in two forms, a Windows (desktop) version and a web version: the desktop version is used to design reports and publish them to the Power BI server, and the web version is used to view the published reports. Power BI server URL:

Prerequisites to start with Power BI:
Visual Studio 2017
Power BI Desktop
An Azure subscription to register the application

Please follow the steps below to set up the environment before starting with the integration:
1. Create an Active Directory on the Azure portal.
2. Register the application on the Azure portal. Note: copy the User Id, Password, ClientID, and Client Secret Key somewhere safe.
3. Grant all access permissions to the application. Later on, you can remove extra permissions as per your requirements.
4. Register on the Power BI server to host reports, and create the workspace.
5. Create a sample report and deploy it to the workspace we created on the Power BI server.

We could access this report directly, without any authorization, with the help of its URL, but in that case there is no data security. That's why we use Power BI Embedded: the report is loaded via its Embed URL, and an authorization token is used for authentication.

Power BI Integration Steps

1. Generate a bearer token to authenticate your application with Azure Active Directory, using the User Id, Password, ClientID, and Client Secret Key from the Azure portal. We are going to generate it with the help of Postman. This bearer token will be used to access all the APIs related to Power BI, e.g. getting the workspace list inside the Active Directory, getting the list of workspace items, and so on. API URL for generating the bearer token:

2. Now that we have the bearer token, we'll get the workspace (groups) list with its help. The API below provides the workspace details, such as the workspace name, id, IsReadOnly, etc.
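The bearer-token generation step above (the request the article issues from Postman) follows Azure AD's classic resource-owner password grant. As a sketch, with the placeholders standing in for the values copied from the Azure portal; the endpoint and parameter names reflect the v1 Azure AD endpoint commonly used with Power BI master-user authentication at the time, so treat them as assumptions:

```http
POST https://login.microsoftonline.com/common/oauth2/token
Content-Type: application/x-www-form-urlencoded

grant_type=password
&client_id=<ClientID>
&client_secret=<ClientSecretKey>
&username=<UserId>
&password=<Password>
&resource=https://analysis.windows.net/powerbi/api
```

The access_token field of the JSON response is the bearer token to send in the Authorization header (Bearer <token>) of the subsequent Power BI REST calls.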
API URL:

3. Now we have the bearer token and the group details. Based on these, we'll get the details of the report we hosted for the integration. There are different API methods for fetching report details, which will be used to integrate with our application; they provide the report id, name, report type, Web URL, Embed URL, Dataset Id, etc.
Web URL: can be accessed without authentication and is open to all; you can view the report by browsing to the URL directly.
Embed URL: a secure URL, which we are going to use for the secure integration.
API URL: {GroupID}/reports

4. Now we have all the details required to access the Power BI report, but we still cannot access it: an authentication token for that specific report is required. So, we'll generate a token for it.
API URL: {GroupID}/reports/{ReportId}/generatetoken

We'll use the Embed URL and the report authentication token to integrate the report with an existing application:
1. Create a sample MVC application.
2. Create all the API methods as explained in steps 1 to 4.
3. Add the below code in the controller.
4. Add the below code in the view.
Now, run the project; it will display the sample report in a web browser.

Conclusion: In this blog, we learned about the Power BI APIs and how to integrate them into an ASP.NET MVC application.
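For reference, the view-side embedding typically uses Microsoft's powerbi-client JavaScript library to render the report into a container div. Below is a minimal, hypothetical sketch of the embed configuration; the helper name and placeholder values are not from the original article, and in a real view you would pass the config to powerbi.embed() with powerbi.js loaded:

```javascript
// Hypothetical helper: build the config object expected by powerbi.embed()
// from the values produced in steps 3 and 4 (report id, Embed URL, embed token).
function buildEmbedConfig(reportId, embedUrl, embedToken) {
  return {
    type: 'report',
    id: reportId,            // report id from the reports API (step 3)
    embedUrl: embedUrl,      // secure Embed URL (step 3)
    accessToken: embedToken, // token from the generatetoken API (step 4)
    tokenType: 1             // models.TokenType.Embed in powerbi-client
  };
}

// In the view (with powerbi.js loaded) this would be rendered with:
//   powerbi.embed(document.getElementById('reportContainer'),
//                 buildEmbedConfig(reportId, embedUrl, embedToken));
```

The design choice here is to keep the sensitive bearer token server-side and hand the browser only the short-lived report embed token, which is exactly why the article generates a per-report token instead of exposing the Web URL.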

Create SSIS Data Flow Task Package Programmatically
Jul 27, 2020

In this article, we will review how to create an SSIS data flow task package from a console application, with an example.

Requirements
Microsoft Visual Studio 2017
SQL Server 2014 SSDT

Done with the above requirements? Let's start by launching Microsoft Visual Studio 2017 and creating a new Console Project with .NET Core. After creating the new project, give it a proper name. In Project Explorer, import the relevant references and ensure that you have declared the namespaces as below:

using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime;
using RuntimeWrapper = Microsoft.SqlServer.Dts.Runtime.Wrapper;

To use the above namespaces we need to add the references below. Keep in mind that all of these references should have the same version.

After importing the namespaces, ask the user for the source connection string, the destination connection string, and the table that will be copied to the destination:

string sourceConnectionString, destinationConnectionString, tableName;
Console.Write("Enter Source Database Connection String: ");
sourceConnectionString = Console.ReadLine();
Console.Write("Enter Destination Database Connection String: ");
destinationConnectionString = Console.ReadLine();
Console.Write("Enter Table Name: ");
tableName = Console.ReadLine();

After the declarations, create an instance of Application and Package:

Application app = new Application();
Package Mipk = new Package();
Mipk.Name = "DatabaseToDatabase";

Create the ADO.NET source connection manager for the package:

ConnectionManager connSource;
connSource = Mipk.Connections.Add("ADO.NET:SQL");
connSource.ConnectionString = sourceConnectionString;
connSource.Name = "ADO NET DB Source Connection";

Create the ADO.NET destination connection manager for the package:
ConnectionManager connDestination;
connDestination = Mipk.Connections.Add("ADO.NET:SQL");
connDestination.ConnectionString = destinationConnectionString;
connDestination.Name = "ADO NET DB Destination Connection";

Insert a data flow task into the package:

Executable e = Mipk.Executables.Add("STOCK:PipelineTask");
TaskHost thMainPipe = (TaskHost)e;
thMainPipe.Name = "DFT Database To Database";
MainPipe df = thMainPipe.InnerObject as MainPipe;

Add the ADO.NET source component to the data flow task:

IDTSComponentMetaData100 conexionAOrigen = df.ComponentMetaDataCollection.New();
conexionAOrigen.ComponentClassID = "Microsoft.SqlServer.Dts.Pipeline.DataReaderSourceAdapter, Microsoft.SqlServer.ADONETSrc, Version=, Culture=neutral, PublicKeyToken=89845dcd8080cc91";
conexionAOrigen.Name = "ADO NET Source";

Get a design-time instance of the component and initialize it:

CManagedComponentWrapper instance = conexionAOrigen.Instantiate();
instance.ProvideComponentProperties();

Specify the connection manager:

conexionAOrigen.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.GetExtendedInterface(connSource);
conexionAOrigen.RuntimeConnectionCollection[0].ConnectionManagerID = connSource.ID;

Set the custom properties:

instance.SetComponentProperty("AccessMode", 0);
instance.SetComponentProperty("TableOrViewName", "\"dbo\".\"" + tableName + "\"");

Reinitialize the source metadata:

instance.AcquireConnections(null);
instance.ReinitializeMetaData();
instance.ReleaseConnections();

Now, add the destination component to the data flow task:

IDTSComponentMetaData100 conexionADestination = df.ComponentMetaDataCollection.New();
conexionADestination.ComponentClassID = "Microsoft.SqlServer.Dts.Pipeline.ADONETDestination, Microsoft.SqlServer.ADONETDest, Version=, Culture=neutral, PublicKeyToken=89845dcd8080cc91";
conexionADestination.Name = "ADO NET Destination";

Get a design-time instance of the component and initialize it:
CManagedComponentWrapper instanceDest = conexionADestination.Instantiate();
instanceDest.ProvideComponentProperties();

Specify the connection manager:

conexionADestination.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.GetExtendedInterface(connDestination);
conexionADestination.RuntimeConnectionCollection[0].ConnectionManagerID = connDestination.ID;

Set the custom properties:

instanceDest.SetComponentProperty("TableOrViewName", "\"dbo\".\"" + tableName + "\"");

Connect the source to the destination component:

IDTSPath100 union = df.PathCollection.New();
union.AttachPathAndPropagateNotifications(conexionAOrigen.OutputCollection[0], conexionADestination.InputCollection[0]);

Reinitialize the destination metadata:

instanceDest.AcquireConnections(null);
instanceDest.ReinitializeMetaData();
instanceDest.ReleaseConnections();

Map the source output columns to the destination columns:

foreach (IDTSOutputColumn100 col in conexionAOrigen.OutputCollection[0].OutputColumnCollection)
{
    for (int i = 0; i < conexionADestination.InputCollection[0].ExternalMetadataColumnCollection.Count; i++)
    {
        string c = conexionADestination.InputCollection[0].ExternalMetadataColumnCollection[i].Name;
        if (c.ToUpper() == col.Name.ToUpper())
        {
            IDTSInputColumn100 column = conexionADestination.InputCollection[0].InputColumnCollection.New();
            column.LineageID = col.ID;
            column.ExternalMetadataColumnID = conexionADestination.InputCollection[0].ExternalMetadataColumnCollection[i].ID;
        }
    }
}

Save the package to the file system:

app.SaveToXml(@"D:\Workspace\SSIS\Test_DB_To_DB.dtsx", Mipk, null);

Execute the package:

Mipk.Execute();

Conclusion
In this article, we have explained one of the alternatives for creating SSIS packages: a .NET console application. In case you have any questions, please feel free to ask in the comment section below.