How manufacturing companies gain value from BI  

Business intelligence (BI) has become important in anticipating customer needs and wants, designing products and processes around them, and delivering to specification profitably. A manufacturing company must know which products are doing well in the market and which are in demand, forecast demand based on past trends, purchase and stock raw materials, store finished goods in inventory, keep track of sales, track payables and receivables, measure profit and loss, and more. If you are looking for a solution that gives quick and valuable insights into these aspects, BI, with features like visualization, reporting, data discovery, scorecards, and dashboards, can be your best investment. Here are ways in which business intelligence can benefit the manufacturing industry:

Informed decision-making

Manufacturing companies today process copious amounts of data from multiple sources, which makes the right processing and utilization of that data essential. BI is a powerful tool for businesses that want to extract and combine data from multiple disparate sources to gain meaningful insights. Data visualization tools can present these insights in a simplified manner, along with key business metrics and KPIs. These insights help decision-makers make more informed business decisions.

Improved operational efficiency

Manufacturing units can use easy-to-navigate dashboards, comprehensive reports, and scorecards to improve efficiency. With the help of business insights, they can improve operational efficiency, reduce waste, enhance product quality, reduce labor, material, and overhead costs, negotiate effectively with suppliers, and present winning quotes to customers.

Financial management

BI tools can be used for sales analysis, profit and loss analysis, and raw material analysis, which helps in optimizing resources and increasing ROI. BI can also identify unexplored revenue channels and minimize internal costs, leading to higher profit margins and fewer unnecessary expenses. It further supports in-depth cost-benefit and demand-supply analysis and helps streamline operational procedures by monitoring and managing processes.

Supply chain and logistics management

By evaluating performance on a regular basis, BI can help manage supply chain logistics and analyze data to ensure quality and timely deliveries, monitor freight costs by identifying fluctuations in supply and demand, and optimize the value of suppliers by giving feedback on their services. Thus, BI can help with shipment performance evaluation and contract negotiation.

Inventory control

One of the most crucial aspects of a manufacturing firm is inventory control and management. BI can help track and reduce inventory costs, avoid the risk of out-of-stock situations by analyzing safety stock data, create accurate forecasts from sales information, and predict overstock conditions well in advance.

Conclusion

The amalgamation of manufacturing and technology, powered by data, is changing the way businesses work. Business intelligence is used for visualizing and analyzing data to meet the data consumption needs of the business. If you want to gain deeper insights with data analytics to meet your information needs and make better decisions with BI solutions, you can get in touch with our BI experts.

Gain advantage of a phased approach for your next BI project

Implementing a business intelligence (BI) solution is a daunting task that involves considerable investment of time, energy, and capital. Despite all its capabilities, a BI solution that is not implemented properly will result in unnecessary delays, cost overruns, data quality and consistency issues, and disconnected end users. Hence, it is crucial to have the right strategy and approach in place. An organization-level BI implementation works best when it is done in a staged or phased manner. When you implement BI with a phase-based, collaborative approach, each phase becomes a building block with a well-defined objective and a significant milestone. Well-defined milestones can be tracked for success at each stage, creating opportunities for the next. Here are the key phases that should be followed when implementing a business intelligence solution:

Phase 0 – Information needs discovery and planning

This stage is a must. Before the project kicks off, it is crucial to invest time in understanding the business context and priorities to avoid rework as the project progresses. An in-depth assessment of the existing infrastructure helps identify gaps, current challenges, and areas of improvement. The key activities in this phase are information needs discovery and discussions with all the stakeholders. This step helps identify the business problems that can be addressed in subsequent phases and formulate a strategic roadmap in line with business priorities.

Phase 1 – Data discovery and aggregation

This phase focuses on understanding the organizational data sets thoroughly and identifying how to pull data from all critical sources together for analysis, keeping in mind the real constraints of the business processes. Assuming that the data can be easily gathered and analyzed can result in delays during BI implementation. Data discovery and aggregation is the most challenging stage because it requires a thorough understanding of the business, its requirements, and how to collect the relevant information: raw data imported from scattered systems can rarely be analyzed as is. The data needs to be cleansed, structured, and normalized using appropriate tools and techniques to create a centralized data warehouse. While designing the centralized data mart, it is essential to look carefully at every field, learn about the relationships between variables, create the most suitable models, and make the data reliable and ready for analysis.

Phase 2 – BI implementation and data visualization

It is always recommended to follow industry standards and proven agile methodologies for BI implementation. This helps in implementing the business intelligence platform and integrating it with other line-of-business (LOB) applications in a streamlined way. Even after the platform is implemented successfully, creating dashboards is no easy task. It is an art that requires much more than just knowing how to use the tool. Data visualization decisions, such as which visual will communicate the results best, who will use the dashboards, how end users will interact with them, and what questions the dashboards are expected to answer, help design dashboards and reports in line with expectations. Once designed and developed, it is essential to validate the data with a cross-functional team for accuracy and quality before going live.
Phase 3 – Training and support: rollout and going live

This phase emphasizes determining the organization's readiness to adopt the new BI system. The rollout phase includes activities like training in the context of the organization and documentation for different roles such as end users and administrators. BI user adoption is a key indicator of a successful BI initiative. Most users in a company are more interested in knowing how to use the solution than in how it has been built. Also, instead of rolling out the solution to the whole company at once, it is recommended to roll out BI to individual departments and/or business areas to ensure smooth and successful adoption. Positive reviews from the first users can build excitement for adoption among other departments. The support phase consists of post-production implementation and adoption activities, such as continuous monitoring of data volumes, report updates, and dashboard usage, to ensure the system works as per business expectations. This phase also uncovers opportunities to update and scale the system.

Phase 4 – Advanced analytics and actions

Every business in today's data-driven world is looking to reap the benefits of predictive analytics, artificial intelligence (AI), machine learning (ML), and other cognitive computing technologies to gain competitive advantage, drive faster decision-making, boost efficiency, and increase customer satisfaction. Advanced analytics is the foundation for unlocking the insights you have been searching for. With a holistic approach to advanced analytics, you can get the right data, make sense of it, and act on it. To transform your organization into an intelligent workplace, ready for today and the future, and to realize the potential your data offers, you must ensure that appropriate data management and data governance policies are in place.

Breaking the requirements into phases makes the process of BI implementation more manageable. Instead of one large deliverable, the phased approach makes the team more productive, as the milestones are well defined and the development process is more iterative, with frequent interaction with project stakeholders allowing a smooth implementation. When all is said and done, one has to be aware that BI adoption and implementation is not a one-time activity. As your business grows, you will have to refine your BI systems and strategies to accommodate new changes. We assist our clients in every step of their BI journey in becoming data-driven, by providing everything from specific expertise on discrete issues to holistic transformations spanning strategy building, implementation, integration, reporting and visualization, training and adoption, and much more! Need help in planning or upgrading your BI implementation? Talk to our experts and get your business analytics maturity report.

Sitecore Helix and Habitat – creating a new module/project

Helix is a set of overall design principles and conventions for Sitecore development. Helix gives developers architectural guidelines that have emerged from experience with past Sitecore implementations. The core focus of Helix is the separation of functionality into modules, a conceptual grouping of all assets related to a single business requirement, and their organization into layers. This organization is beneficial because it reduces dependencies between modules and minimizes side effects and additional effort in later iterations. Helix defines rules on how to organize the filesystem and the items in the content tree. When we follow these rules, we can be sure that other developers can easily navigate the solution we built.

Habitat is a way of implementing a Sitecore solution based on Helix design principles. Habitat is a real Sitecore project, built using the overall design principles and conventions provided by Helix. The Habitat solution for Sitecore v9 is available on GitHub, and the Habitat demo site is also available online for reference. The Sitecore Habitat solution can be set up by following the steps explained in many articles and videos available online. This article describes the steps to add a new module/project in Sitecore while following Helix principles and using the Habitat solution.

Creating a new module/project

Here we take the example of creating a new project named Ports in the Project section of the Sitecore Habitat solution. Any module in the Feature and Foundation layers can be created in the same way:

1. Create the folder structure:
a. Open Windows File Explorer and navigate to the path of the existing Sitecore solution, C:\VSTS\Advaiya\SitecoreCode\Advaiya
b. Inside the above path, create the folder structure src\Project\Ports and src\Project\Ports\serialization

2. Create a new project in Visual Studio: open Visual Studio 2017 in administrator mode
a. Open the Sitecore Habitat solution from the location C:\VSTS\Advaiya\SitecoreCode\Advaiya
b. Create a solution folder named "Ports" inside the "Project" section
c. Create a project in the newly created "Ports" folder:
i. Right-click "Ports", then Add → New Project
ii. Use the "ASP.NET Web Application" template
iii. Select the project location in the file system as C:\VSTS\Advaiya\SitecoreCode\Advaiya\src\Project\Ports
iv. Name the project "code" to maintain the correct folder structure. We will rename it in the next steps.
v. Click Next, select the "Empty" project template, and check the "MVC" checkbox
vi. Click "OK"

3. Set properties on the new project:
a. Rename the project to Sitecore.Ports.Website
b. Go to the project properties and set the assembly name and namespace to Sitecore.Ports.Website
c. Update the project framework to ".NET Framework 4.6.2"
d. Delete the App_Data and App_Start folders and the Global.asax file from the project
e. Go to the properties of the "web.config" and "/Views/web.config" files and set the "Build Action" property to "None". With this property set, these files will not be published with the project.

4. Add configuration files:
a. Create the following folder structure in Visual Studio under Sitecore.Ports.Website: /App_Config/Include/Project and /App_Config/Environment/Project

5. Create a publish profile:
a) Copy the "/Properties/PublishProfiles" folder from another project.
b) Paste it at the same place in the new Sitecore.Ports.Website project.

6. Copy required assemblies from another project:
a) Unload the project and edit the project file.
b) Unload any other project and edit its project file.
c) Compare both project files and add the required assembly references to the new project from the other project.
d) Comment out the target location at the end of the project file, comparing with the other project.
e) Reload the projects.
f) Rebuild the new project; it will now resolve all the required assembly references.

7. Create the required folders in the CMS:
a) Go to the Sitecore CMS, enter your login details, and open the Content Editor
b) Create the required project folder inside each of the following:
i. Templates
ii. Media
iii. Content
iv. Layout
v. Rendering
vi. Placeholder settings
vii. Models
viii. Forms

8. Check configurations and sync:
a) Open the Unicorn.aspx page
b) Check whether all configurations are in place for the newly created project/module
c) If not, check for errors and resolve them one by one
d) Re-serialize the newly created project/module
e) Sync the project/module

The newly created project or module is now ready; a minimal sketch of the kind of code that goes into it is shown below. If you need to create a new project for a website in a multisite solution, then the IIS binding and hosts file entry for the new website have to be added as well. If you need any assistance with Sitecore implementation or with the creation of a project/module, our team of experts can help you.
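As an illustration of the code that lives in the new Sitecore.Ports.Website project, here is a minimal, hypothetical sketch of a controller rendering for the Ports module. The PortsController name, action, and view path are illustrative assumptions, not part of the Habitat solution; the controller would still need a corresponding Controller Rendering item and a view to be usable.

using System.Web.Mvc;
using Sitecore.Mvc.Presentation;

namespace Sitecore.Ports.Website.Controllers
{
    // Hypothetical controller backing a "Ports" controller rendering.
    public class PortsController : Controller
    {
        public ActionResult PortDetails()
        {
            // The item the rendering is bound to, typically its datasource item
            // (null when there is no current rendering context).
            var dataSourceItem = RenderingContext.CurrentOrNull?.Rendering?.Item;

            // Keep the view inside the module, following the Helix convention of module-local views.
            return View("~/Views/Ports/PortDetails.cshtml", dataSourceItem);
        }
    }
}

To wire this up in Sitecore, a Controller Rendering item pointing to this controller and action would typically be created under the Rendering folder added for the project in step 7.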

Key features of Sitecore Experience Platform 9.1

Sitecore Experience Platform 9.1 has a lot of exciting capabilities, refinements, and enhancements to existing functionality. It brings many innovations that will help customers get to market faster. In addition to key platform and infrastructure enhancements, Sitecore 9.1 focuses on five key areas: Sitecore Identity, Sitecore Host, Headless – Sitecore JavaScript Services (JSS), Sitecore Cortex, and the Sitecore Accelerator Framework (SxA).

Sitecore Identity

The federated authentication service is the most exciting feature in Sitecore 9.1. There is a new login screen enabling single sign-on (SSO) with Sitecore Experience Manager, Sitecore Commerce, and Horizon, the new UI released along with this version. In all earlier versions of Sitecore, user authentication was done using forms authentication with ASP.NET Membership in a simple implementation. With Sitecore 9.1, a new module has been launched that includes Active Directory integration. Federated authentication is widely used across the industry, and Sitecore 9.1 has finally adopted it, providing user authentication and authorization through a centralized federation service. This feature runs as a separate application and provides single login access. It is an in-built module that comes with Sitecore 9.1 and reduces the effort developers spend integrating or installing third-party modules to support different login systems.

Sitecore Host

Sitecore Host, which is based on the .NET Core framework, is the base for services and takes care of a number of cross-cutting concerns. It handles those aspects of services or apps that are not directly linked to business functionality, like application logging, configuration management, database access, etc. The advantages of running a service built on Sitecore Host are:
The experience is consistent across all Sitecore Host applications.
The installation experience is consistent.
A lean common runtime for Sitecore .NET Core services/apps.
It is possible to configure applications on the command line, without the need for a UI.
It is possible to extend features using host plugins.
The behavior of the host is the same in the cloud and on-premises.

Headless – Sitecore JavaScript Services

In Sitecore, coordination between front-end and back-end developers has always been difficult. Sitecore 9.1 supports JavaScript Services (JSS), which provides a native developer experience: front-end developers can consume Sitecore content with the most common JavaScript frameworks like React, Angular, and Vue. In addition to JavaScript integration, JSS gives developers access to Sitecore back-end services like layout and personalization features, including xDB/xConnect capabilities. Thus, front-end developers can begin creating components separately using JSS, which increases the possibilities of creating single-page applications and native mobile application experiences without sacrificing any feature of the Sitecore Experience Platform.

Sitecore Cortex

Sitecore Cortex is the machine learning program for Sitecore. This capability was already present in earlier versions, but Sitecore 9.1 brings the next level of machine learning innovation with a customizable and extensible data processing engine to process data and train models accordingly. Sitecore Cortex combines advanced machine learning algorithms with a processing engine, enabling rapid implementation of ML/AI-based technology.
It provides personalization recommendations and offers automatic content tagging capabilities for any product or customer. This feature helps improve SEO results; companies can get better ROI and reduce the guesswork in testing and optimization. With proper training and extension, it enables advanced search and highly refined personalization. Cortex also provides a pluggable architecture, which means other ML engines can be utilized as well.

Sitecore Accelerator Framework (SxA)

The Web Content Accessibility Guidelines (WCAG) are a set of web standards that make web and related content accessible to users of all abilities. The Sitecore 9.1 accelerator framework supports the industry-standard WCAG 2.0 accessibility guidelines. This allows customers to easily build solutions with the experience accelerator that comply with accessibility guidelines, extending the reach of the experience accelerator to regulated industries such as the public sector and healthcare.

There are many other updates introduced in Sitecore 9.1, such as:
Enhancements in experience analytics
Enhancements in the email experience manager – new content for e-commerce and marketing automation
Project Horizon new UI
XM-only platform deployments for simpler CMS needs
Support for Solr 7.2.1

All these changes in Sitecore 9.1 help developers implement Sitecore solutions better and expand the capabilities of the platform for businesses.

Quantarium’s REAL AI for REAL Estate

At Advaiya, we have the pleasure of running across dozens of amazing companies in technology, services, and product development. These companies range from fledgling startups to the world's largest technology companies. With such a large range at play, one might ask what the commonalities are. Indeed, that is the million-dollar question: what is the taproot of greatness in companies of any size? A common answer is "people." Great people make great companies, or so the logic goes. We believe this is true, but only partly so. We believe strongly that the sine qua non of greatness is a combination of people and timing. Markets are dynamic; they change. This dynamism does not necessarily spell doom for incumbents, but it does raise the need for timely adaptation. The history of business is littered with the carcasses of projects and companies that were "too soon" or "too late." These stories need no recitation; we have all heard plenty about such "Kodak moments." Given this, we get excited when we find companies that seem to have their timing perfectly aligned with market dynamics.

One such company is Quantarium, an artificial intelligence company in Bellevue, WA. With a team of data scientists, mathematicians, commercial executives, and entrepreneurs, Quantarium applies AI to the vast and important residential real estate industry. The numbers are staggering: residential real estate is the world's largest asset class, with a worldwide estimated value of almost $200 trillion. In the United States, this number is closer to $30 trillion. Despite the size of the asset class and market, real estate data is still a developing business. Traditional parts of the real estate value chain are often isolated from one another, and no particular part has comprehensive, timely, and contextual data at its disposal in an actionable and time-sensitive setup. This is partly due to infrastructure, partly to culture, and partly because the questions surrounding real estate are incredibly complex. Indeed, we find that AI, deep learning, and scientific big data management are necessary to create an accurate and repeatable analytics platform in this multivariate industry.

That is why we love Quantarium. With the largest and most sophisticated data lake in the business, and with the speed, scale, and insight driven by AI, they are enhancing the industry. Consider, for instance, their AVM solution, the best in class, which helps mortgage owners, banks, and other financial institutions value their residential real estate assets, or their Portfolio Analytics, which allows for propensity models, if/then scenarios, and other focused, game-changing solutions directly applicable to today's housing market.

This brings us back to the notion of timeliness. Claims around data and AI abound, and not all of them have paid off in reality. Quantarium, however, developed these solutions painstakingly and got them right before releasing them into the wild. We believe timing is on their side. We are impressed, but not robotic cheerleaders. Much remains to be seen, but from our vantage point, Quantarium is a Real AI company that offers Real value to one of the largest markets on the planet. We, at Advaiya, are happy to be a trusted partner of Quantarium, supporting their solution vision for data and AI in the real estate industry.

Business Intelligence trends to watch for in 2019

Business intelligence is set to become even more thrilling as artificial intelligence continues to be the toast of tech professionals all over the globe. Companies will no longer debate the benefits of adopting analytics but will instead look for the best BI solution for their industry. Here are the top BI trends to look out for in 2019 that will add value to your company:

1: Customer experience

Customer analytics will be one of the primary focus areas for business intelligence in 2019. Customer journey analytics, speech analytics, emotion detection, and customer engagement center (CEC) interaction analytics await service leaders who want to visualize and connect the customer journey across multiple devices and channels. Consumer expectations are rising, and supporting their requirements with a colossal amount of data will be a challenging task. Analyzing and predicting user behavior will go hand in hand with data management and artificial intelligence. Becoming data-driven and using business analytics tools for the most effective decision-making will become essential for the sustainable growth of a company.

2: Growing importance of the CDO and CAO

Today, big data and analytics are becoming vital for every business. Every company has had a Chief Information Officer who administers all the information management assets and security issues. But analytics roles are becoming so important that new positions have emerged: the Chief Data Officer (CDO) and the Chief Analytics Officer (CAO). The role of a Chief Data Officer is to provide users with data that is clean and ready to use. The CDO also manages a company's data assets and improves the efficiency of data analytics. Another important role is that of the Chief Analytics Officer. While it complements the roles of the CDO and CIO, it is becoming a sought-after role: "If the CDO takes care of data enablement, then the CAO is about how you drive insights off that data and make data actionable."

3: Increased investment in and adoption of artificial intelligence

Artificial intelligence has taken center stage, and analysts believe that companies that have not started adopting it might not catch up with the competition in 2019. Research shows that 38% of businesses have already implemented AI in some form, and an even more significant percentage of companies are assessing the technology's worthiness. Artificial intelligence platforms can analyze data inputs faster than humans and uncover hidden insights that people might miss without relying on technology. Some AI tools can gather information directly from customers in a streamlined way. As per a poll conducted in 2016, 80% of companies were already using chatbots or wanted to do so by 2020. This is because businesses can derive valuable insights from chatbots, mainly by paying attention to the words people use and the sentiments expressed. Companies that wait too long to deploy AI risk falling further behind their peers.

4: Self-service BI interfaces

With the boom of self-service BI interfaces, the role of the data scientists who used to take deep dives into analytics will end. One study also revealed that companies consider business intelligence something that differentiates them in the marketplace and helps them promote a data-driven culture in the organization. If companies have not started using BI, self-service software can be a smart way to begin.
However, before planning to invest in such tools, businesses must evaluate what they aim to learn from BI and how those insights fit into their overall business operations.

5: Human learning styles

Humans use a mixture of sensory inputs to learn, often defined as three learning styles: auditory/reading, visual, and kinesthetic. By the end of 2019, business intelligence will be making use of information delivery mediums that address all of these learning styles. For auditory learners, for example, auto-generated narratives in a written or spoken format that describe the patterns in the selected data will come into existence. Similarly, 3D printing can play a significant role in creating maps for kinesthetic learners, because they work best when they can physically feel something. For visually led learners, the options will grow, with high-resolution displays enabling the rendering of enormous data sets and, perhaps, virtual reality experiences. Let's tighten our seat belts and get thrilled by what this new year will bring.

Sitecore 9 installation on a local development environment

From the time Sitecore 9 was released, there has been a lot of talk about the method of installation. Installing Sitecore 9 is harder than installing the previous versions, and in this version you will be using a lot more PowerShell. When I was trying to install Sitecore 9.0 on my system, I had to go through multiple websites and videos to learn the method. To simplify this process, I have listed below the steps to be followed for the installation of Sitecore 9.0 on a local development environment.

Prerequisites
a. Windows 8.1, 10, or Server 2012 R2
b. .NET Framework 4.6.2
c. SQL Server 2016 or greater
d. IIS 8.5 or 10
e. Java Runtime Environment (JRE)
f. PowerShell 5.1+
g. Web Platform Installer 5.0
h. Web Deploy 3.6
i. Solr 6.6.2
j. NSSM

Step 1 – Gather files:
a. Sitecore license file
b. On-prem XP0 instance package: Sitecore 9.0.2 rev. 180604 (download from here)

Step 2 – Prepare the install folder:
• Create the install folder C:\Sitecore\install
• Move the license file to the install folder
• Extract the ZIP file (Sitecore 9.0.2 rev. 180604) and move the files below to the install folder:
1: Sitecore 9.0.2 rev. 180604 (OnPrem)_single.scwdp.zip
2: Sitecore 9.0.2 rev. 180604 (OnPrem)_xp0xconnect.scwdp.zip
3: Extract the XP0 Configuration files 9.0.2 rev. 180604.zip file and move all its JSON files to the install folder

Step 3 – Install and start Solr:
• Extract the Solr 6.6.2.zip file and move it to the C:\Sitecore folder
• Open PowerShell as administrator and move to the C:\Sitecore\solr-6.6.2\bin directory
• Run the PowerShell command:
.\solr start -p 8983
• Open a browser with the URL http://localhost:8983/solr and check that Solr is running properly

Step 4 – Set up Solr as a service:
• Download NSSM from http://nssm.cc
• Install it to the C:\Sitecore folder
• Open Command Prompt as administrator and navigate to C:\Sitecore\nssm\win64
• Run the command: nssm install solr6
• Under the Application tab, fill in the values:
Path: C:\Sitecore\solr-6.6.2\bin\solr.cmd
Startup directory: C:\Sitecore\solr-6.6.2\bin
Arguments: start -f -p 8983
• Under the Details tab, fill in the values:
Name: Solr6
Description: Start/Stop Solr Service
• Click the Install Service button
• Open the Service Manager, check that the Solr6 service has been created, and then start the service

Step 5 – Configure SSL for Solr:
• Download solrssl.ps1 from here
• Move the solrssl.ps1 file to the install folder C:\Sitecore\install
• Edit the solrssl.ps1 file:
o Open the file in an editor
o Navigate to $keytool = (Get-Command 'keytool.exe').Source
o Set the location of keytool.exe, for example $keytool = (Get-Command 'C:\Program Files\Java\jre1.8.0_191\bin\keytool.exe').Source
• Open PowerShell, navigate to C:\Sitecore\install, and run the command:
.\solrssl.ps1 -KeystoreFile C:\Sitecore\solr-6.6.2\server\etc\solr-ssl.keystore.jks
• Open C:\Sitecore\solr-6.6.2\bin\solr.in.cmd in an editor, search for the following lines, and remove the REM prefix:
REM set SOLR_SSL_KEY_STORE=etc/solr-ssl.keystore.jks
REM set SOLR_SSL_KEY_STORE_PASSWORD=secret
REM set SOLR_SSL_KEY_STORE_TYPE=JKS
REM set SOLR_SSL_TRUST_STORE=etc/solr-ssl.keystore.jks
• Restart the Solr service
• Solr should now run over HTTPS: https://localhost:8983/solr

Step 6 – Set up a SQL Server user:
• Create a new SQL Server login and give it the sysadmin server role
• Execute the code below:
sp_configure 'contained database authentication', 1;
GO
RECONFIGURE;
GO

Step 7 – Install the SIF module:
• Open PowerShell as administrator and run the commands below:
Register-PSRepository -Name SitecoreRepo -SourceLocation https://sitecore.myget.org/F/sc-powershell/api/v2
Install-Module SitecoreInstallFramework -RequiredVersion 1.1.0
Install-Module SitecoreFundamentals -RequiredVersion 1.1.0
Import-Module SitecoreFundamentals
Import-Module SitecoreInstallFramework

Step 8 – Run the installation scripts:
• Update the parameters in all the scripts and then run them:
o $prefix: name of the Sitecore instance
o $SqlServer: SQL Server name
o $SqlAdminUser, $SqlAdminPassword: the user created in Step 6

Script 1:

#define parameters
$prefix = "xp1"
$PSScriptRoot = "C:\Sitecore\install"
$XConnectCollectionService = "$prefix.xconnect"
$sitecoreSiteName = "$prefix.local"
$SolrUrl = "https://localhost:8983/solr"
$SolrRoot = "C:\Sitecore\solr-6.6.2"
$SolrService = "solr6"
$SqlServer = "DESKTOP-K34H1RL"
$SqlAdminUser = "sc9"
$SqlAdminPassword = "pass@word1"

#install client certificate for xconnect
$certParams = @{
    Path = "$PSScriptRoot\xconnect-createcert.json"
    CertificateName = "$prefix.xconnect_client"
}
Install-SitecoreConfiguration @certParams -Verbose

#install solr cores for xdb
$solrParams = @{
    Path = "$PSScriptRoot\xconnect-solr.json"
    SolrUrl = $SolrUrl
    SolrRoot = $SolrRoot
    SolrService = $SolrService
    CorePrefix = $prefix
}
Install-SitecoreConfiguration @solrParams

#install solr cores for sitecore
$solrParams = @{
    Path = "$PSScriptRoot\sitecore-solr.json"
    SolrUrl = $SolrUrl
    SolrRoot = $SolrRoot
    SolrService = $SolrService
    CorePrefix = $prefix
}
Install-SitecoreConfiguration @solrParams

Script 2:

#define parameters
$prefix = "xp1"
$PSScriptRoot = "C:\Sitecore\install"
$XConnectCollectionService = "$prefix.xconnect"
$sitecoreSiteName = "$prefix.local"
$SolrUrl = "https://localhost:8983/solr"
$SolrRoot = "C:\Sitecore\solr-6.6.2"
$SolrService = "solr6"
$SqlServer = "DESKTOP-K34H1RL"
$SqlAdminUser = "sc9"
$SqlAdminPassword = "pass@word1"

#install client certificate for xconnect
$certParams = @{
    Path = "$PSScriptRoot\xconnect-createcert.json"
    CertificateName = "$prefix.xconnect_client"
}
Install-SitecoreConfiguration @certParams -Verbose

#deploy xconnect instance
$xconnectParams = @{
    Path = "$PSScriptRoot\xconnect-xp0.json"
    Package = "$PSScriptRoot\Sitecore 9.0.2 rev. 180604 (OnPrem)_xp0xconnect.scwdp.zip"
    LicenseFile = "$PSScriptRoot\license.xml"
    Sitename = $XConnectCollectionService
    XConnectCert = $certParams.CertificateName
    SqlDbPrefix = $prefix
    SqlServer = $SqlServer
    SqlAdminUser = $SqlAdminUser
    SqlAdminPassword = $SqlAdminPassword
    SolrCorePrefix = $prefix
    SolrURL = $SolrUrl
}
Install-SitecoreConfiguration @xconnectParams

Script 3:

#define parameters
$prefix = "xp1"
$PSScriptRoot = "C:\Sitecore\install"
$XConnectCollectionService = "$prefix.xconnect"
$sitecoreSiteName = "$prefix.local"
$SolrUrl = "https://localhost:8983/solr"
$SolrRoot = "C:\Sitecore\solr-6.6.2"
$SolrService = "solr6"
$SqlServer = "DESKTOP-K34H1RL"
$SqlAdminUser = "sc9"
$SqlAdminPassword = "pass@word1"

#install client certificate for xconnect
$certParams = @{
    Path = "$PSScriptRoot\xconnect-createcert.json"
    CertificateName = "$prefix.xconnect_client"
}
Install-SitecoreConfiguration @certParams -Verbose

#install sitecore instance
$xconnectHostName = "$prefix.xconnect"
$sitecoreParams = @{
    Path = "$PSScriptRoot\sitecore-XP0.json"
    Package = "$PSScriptRoot\Sitecore 9.0.2 rev. 180604 (OnPrem)_single.scwdp.zip"
    LicenseFile = "$PSScriptRoot\license.xml"
    SqlDbPrefix = $prefix
    SqlServer = $SqlServer
    SqlAdminUser = $SqlAdminUser
    SqlAdminPassword = $SqlAdminPassword
    SolrCorePrefix = $prefix
    SolrUrl = $SolrUrl
    XConnectCert = $certParams.CertificateName
    Sitename = $sitecoreSiteName
    XConnectCollectionService = "https://$XConnectCollectionService"
}
Install-SitecoreConfiguration @sitecoreParams

After successfully running the above scripts, you can access the Sitecore site using the URL http://xp1.local. This method can be followed for the installation of Sitecore on a local development environment. You can also view Sitecore Multisite Architecture – Single Installation. In case of queries, add your comments below.

How to embed a Power BI report into an application for your customers

Power BI Embedded in Azure provides the ability to embed reports, dashboards, or tiles into an application using the "app owns data" scenario. App owns data means having an application that uses Power BI as its embedded analytics platform. As an ISV developer, you can embed fully interactive Power BI reports in an application, and the end users of the application will not require a Power BI license to view the report content.

Why do you need an analytics solution embedded into your app?

Embedding Power BI lets ISVs and developers put visuals into customer applications and assist users in making better decisions without building an analytics solution from scratch. Embedded analytics helps business users access their business data and perform the required actions to generate valuable insights within the application. It also lets you:
Minimize development effort and achieve faster time to market with your application.
Spend more time focusing on your product rather than developing visual analytics features from scratch.
Deliver value to your customers by letting them easily explore data and gain insights from anywhere.

In this blog, I will explain how you can integrate a Power BI report into an application using the Power BI .NET SDK. Here you are using Power BI Embedded in Azure for your customer with the app owns data scenario.

Prerequisites
A Power BI Pro account (sign up for a free trial)
A Microsoft Azure subscription (create a free account)
Your own Azure Active Directory tenant
Visual Studio (version 2013 or later)

Part 1 – Registering your application in Azure Active Directory (Azure AD)

To establish an identity for your application and to specify permissions to Power BI REST resources, you need to register your application in Azure AD. The account used to register the application should also have the rights to grant permissions. Go to the Azure portal and then to Azure AD. Go to App registrations, click New application registration, and follow the prompts to create a new application. You then need to enable additional permissions for your application. Click Settings on the application you just created and click Required permissions. Click Windows Azure Active Directory, select all the Delegated Permissions, and click Save. Now click Add and add the Power BI Service API. Select all permissions under Delegated Permissions and select Save when done. Remember, you can only grant the permissions if your account has Global Admin rights.

Part 2 – Setting up the Power BI service environment

To embed reports for your customers, you need to place your reports in an app workspace, and the master account must have admin rights for the workspace. Create an app workspace by selecting Workspaces > Create app workspace. This is where you are going to publish your Power BI report. Open your Power BI file in Power BI Desktop and publish the report to the workspace you have created. Now you can view the report in the Power BI service.

Part 3 – Embedding the report into your application

For demo purposes, download the App Owns Data sample application code provided by Microsoft. To embed the Power BI report into your application, you need to fill in five fields in order to run the application successfully: the application ID, the group (workspace) ID, the report ID, and the Power BI username and password. Open the Web.config file in the sample application; this is where you must fill in all these values. Fill in the application ID field with the ID of the application we created in Azure. The hypothetical sketch following this paragraph shows roughly how the sample's server-side code uses these five values.
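For orientation only, here is a minimal sketch of the server-side flow that the App Owns Data sample implements with ADAL and the Power BI .NET SDK: authenticate the Power BI master account, then generate an embed token for the report. The EmbedService class name and the appSettings keys are illustrative assumptions, not the sample's exact code, and method signatures can vary slightly between SDK versions.

using System;
using System.Configuration;
using System.Threading.Tasks;
using Microsoft.IdentityModel.Clients.ActiveDirectory; // ADAL
using Microsoft.PowerBI.Api.V2;
using Microsoft.PowerBI.Api.V2.Models;
using Microsoft.Rest;

public class EmbedService // hypothetical helper class
{
    public async Task<EmbedToken> GetEmbedTokenAsync()
    {
        // The five values filled into Web.config in Part 3 (key names are illustrative).
        string applicationId = ConfigurationManager.AppSettings["applicationId"];
        string workspaceId = ConfigurationManager.AppSettings["workspaceId"];
        string reportId = ConfigurationManager.AppSettings["reportId"];
        string username = ConfigurationManager.AppSettings["pbiUsername"];
        string password = ConfigurationManager.AppSettings["pbiPassword"];

        // 1. Authenticate the Power BI master account against Azure AD.
        var authContext = new AuthenticationContext("https://login.microsoftonline.com/common/oauth2/authorize");
        var credential = new UserPasswordCredential(username, password);
        var authResult = await authContext.AcquireTokenAsync(
            "https://analysis.windows.net/powerbi/api", applicationId, credential);

        // 2. Call the Power BI REST API through the .NET SDK using the acquired access token.
        var tokenCredentials = new TokenCredentials(authResult.AccessToken, "Bearer");
        using (var client = new PowerBIClient(new Uri("https://api.powerbi.com/"), tokenCredentials))
        {
            // 3. Generate an embed token for the report in the app workspace. The token,
            //    together with the report's embed URL and ID, is what the client-side
            //    JavaScript passes to powerbi.embed().
            var tokenRequest = new GenerateTokenRequest(accessLevel: "view");
            return await client.Reports.GenerateTokenInGroupAsync(workspaceId, reportId, tokenRequest);
        }
    }
}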
The application ID is used by the application to identify itself to the users from whom you are requesting permissions. In the Azure portal, select All services, then App registrations, and click the application you created in Part 1; the Application ID is shown there. Now open your Power BI report in the Power BI service. The report's URL gives you the workspace ID and the report ID: the first ID, written after "groups", is the workspace ID (in this example, 1a8d021d-9bfd-47a0-966f-c2d181449b26), while the second ID, written after "reports", is the report ID (here, 8ecc94fa-fe19-41f5-968e-725989ede165). So now we have all three IDs required. Put them in the Web.config file along with the username and password of the Power BI master account. Then run the application, and you can view the report in the sample application.

Part 4 – Move to production

After you have completed developing your application, you have to back your app workspace with a dedicated capacity; this dedicated capacity is required to move to production. A dedicated capacity gives you the advantage of having dedicated resources for your customers. You can purchase a dedicated capacity in the Azure portal; use the capacity table to determine which Power BI Embedded capacity suits your needs. Sign in to the Azure portal and select Create a resource > Data + analytics. Search for Power BI Embedded and, within Power BI Embedded, select Create. Fill in the required information and then select Create. Finally, after you have created a dedicated capacity, assign your app workspace to it. In the Power BI service, expand Workspaces, click the ellipsis for the workspace you are using for embedding your content, and then click Edit workspaces. Click Advanced, enable Dedicated capacity, select the dedicated capacity you have created, and click Save.

These are the steps to embed a Power BI report into your customer's application. We started by registering the application in Azure AD, then set up the Power BI environment, embedded the report into the application by updating the application's code, and finally, to move to production, created a dedicated capacity to back the app workspace hosting the embedded report. Get in touch with us to meet your specific information needs and make better decisions with Power BI solutions.