From chaos to clarity: Overcome Azure data challenges

Effective data management is fundamental to any organization that wants to use information to improve its business outcomes. Unfortunately, many organizations struggle with ineffective data management and, as a result, suffer inefficiencies and missed opportunities. Such problems can be addressed with modern platforms such as Azure, which transform how data management solutions are delivered. At Advaiya Solutions, we help you take control of your data by utilizing the capabilities of the Azure platform.

The importance of effective data management

Data management governs how data is organized, stored, and analyzed so that it supports timely problem-solving and keeps business operations competitive. Without sound data management practices, organizations face problems such as data silos, poor data standardization, and the loss of useful information. These barriers can hurt your business in areas such as responding to changing markets and streamlining business processes.

The challenges of ineffective data management

Many organizations encounter several key challenges with data management that can lead to inefficiencies and missed opportunities:

1. Fragmented data storage
Data fragmentation occurs when information is stored in isolated systems or departments, making it hard to obtain a complete picture of the information. Such fragmentation can result in redundant work, uncoordinated reporting, and ineffective strategies.

2. Inconsistent data quality
Data quality problems arise from incomplete, inconsistent, or obsolete data. This inconsistency undermines data-driven decisions about business performance.

3. Limited scalability
Traditional data management systems tend to hit scalability limits as the volume of data increases. Without the ability to handle large amounts of data efficiently, businesses may face performance issues and higher costs.

4. Complex integration
Integrating data from various sources can be complex and time-consuming, especially if the systems in place lack compatibility. Effective integration is crucial for ensuring that data is accessible and usable across the organization.

How Azure transforms data management

Azure provides a comprehensive suite of tools and services designed to address these data management challenges. By leveraging Azure’s advanced capabilities, businesses can enhance their data management practices and unlock new opportunities for growth.

1. Centralized data management
With Azure, companies can aggregate data from various sources into a single location. This eliminates data silos while giving a comprehensive view of critical information, improving collaboration and decision-making.

2. Enhanced data quality
With Azure, organizations can implement robust data governance and quality management practices. Azure’s tools help ensure that data is accurate, complete, and up to date, supporting reliable insights and informed decision-making.

3. Scalable solutions
Azure’s cloud-based infrastructure offers scalable solutions that can grow with your business. Whether you need to manage large volumes of data or support complex data processing tasks, Azure provides the flexibility and scalability required to meet your needs.

4. Seamless integration
Azure facilitates easy integration of data from diverse sources. With services like Azure Data Factory, businesses can automate data workflows, ensuring that data is seamlessly integrated and accessible for analysis and reporting.

Leveraging Azure AI and ML for advanced data management

One of the key strengths of Azure is its integration of artificial intelligence (AI) and machine learning (ML) capabilities. These technologies enhance data management by providing advanced analytics, predictive insights, and automation.

1. AI-driven insights
Azure AI allows businesses to deploy machine learning models and generate advanced analytics from their data. Azure Machine Learning studio provides a robust platform for building, training, and deploying models, offering ready-to-use algorithms and real-time predictions.

2. Predictive analytics
By leveraging machine learning models, organizations can forecast trends and make data-driven predictions. Azure’s AI solutions enable businesses to analyze historical data and identify patterns, providing valuable insights for strategic planning.

3. Automated processes
AI and ML automation streamline repetitive tasks and processes, improving operational efficiency. Azure’s AI capabilities can automate data processing, enhance decision-making, and reduce manual effort.

Why choose Advaiya Solutions for effective data management

At Advaiya Solutions, we specialize in harnessing the power of Azure to optimize data management for our clients. Our expertise in deploying AI and ML on Azure helps businesses overcome the challenges of ineffective data management and achieve clarity in their data practices. Here’s how we add value:

1. Comprehensive AI/ML deployment
We assist organizations in deploying AI and ML solutions across their operations. From building predictive models to implementing real-time analytics, our team ensures that AI and ML are integrated effectively to support your business goals.

2. Customized data solutions
We tailor data management solutions to the unique needs of your organization. Whether it’s integrating data from various sources or enhancing data quality, we provide solutions that align with your specific requirements.

3. Azure Active Directory integration
Azure Active Directory (Azure AD) simplifies identity and access management for applications deployed in the Azure environment. By leveraging Azure AD, we ensure secure and efficient authentication, reducing the complexity of managing user access and credentials.

4. Real-time analytics and insights
We implement real-time analytics solutions that provide timely insights and support proactive decision-making. Our expertise in Azure ensures that your data management practices are optimized for efficiency and effectiveness.

Conclusion

Effective data management is critical for unlocking the full potential of your data and driving business success. By addressing fragmented data storage, inconsistent data quality, scalability issues, and complex integration, Azure provides a robust framework for transforming data management practices. At Advaiya Solutions, we leverage Azure’s capabilities to enhance your data management processes, enabling you to achieve clarity, efficiency, and actionable insights. Contact us today to discover how our solutions can help you overcome data management challenges and drive your business forward.

Office365 Operations with Azure AD Multifactor Authentication

While working with Office 365 custom applications, certain non-browser applications do not support multifactor authentication. For example, in Project Online/SharePoint Online we use OData, REST, and CSOM operations for various custom tasks. To carry out these operations we require a service account, and traditionally it had to have Azure AD multifactor authentication disabled. However, due to security concerns, organizations now prefer that service accounts also have multifactor authentication enabled, which the custom applications do not support out of the box. Here I will share how we can handle and perform custom Project Online operations with an account that has multifactor authentication enabled.

We can carry out the operations with multifactor authentication-enabled accounts in two ways:

1. App password
2. Azure AD app authentication

Let’s discuss both methods.

App password

An app password replaces the user’s regular credentials for a specific application, allowing the application to work correctly by bypassing multifactor authentication. Signing in with an app password does not produce any additional verification prompt, and authentication succeeds. These passwords are automatically generated, which helps keep them secure. Before users can create app passwords, an admin needs to enable the app password feature:

1. Sign in to the Azure portal (portal.azure.com).
2. Go to Azure Active Directory.
3. Select Security under Manage.
4. Select the Conditional Access option from the left.
5. Select Named locations from the left.
6. Click Configure multifactor authentication trusted IPs.
7. On the multifactor authentication page, select “Allow users to create app passwords to sign in to non-browser apps”.

Now, let’s see how to generate an app password:

1. Log in with your Office 365 account, go to the My Account page, and select Security info.
2. On the Security info page, click Add sign-in method.
3. Choose App password and add it. You will be asked to enter a name for the app password.
4. Copy the password for future use, as it will not be shown again and you would otherwise have to generate a new one. Click Done.

You are now ready to use this app password in your applications. You can use it in any Office 365 operation via CSOM or OData with an MFA-enabled account. You can delete the app password from the list on the Security info page or create a new one at any time.

Creating the Azure AD app for multifactor authentication

Here I will show the configuration needed to perform Project Online operations with a multifactor authentication-enabled account using an Azure AD app. This option is helpful when the app password approach also does not work due to security configuration.

1. Using an Office 365 account, log in to https://portal.azure.com.
2. From the home page, select Azure Active Directory.
3. If you already have an Azure Active Directory app, you can use it to set the required permissions; otherwise, create a new one. Click All applications to select an existing app. To create a new app, click Add and select App registration, give the app a name, and register it, keeping the default settings for permissions.
4. Open the app, click Authentication -> Add a platform, and select “Mobile and desktop applications”. Configure the redirect URI by selecting https://login.microsoftonline.com/common/oauth2/nativeclient.
5. Enable “Allow public client flows” by selecting Yes, and save.
6. Select API permissions from the left section, then click Add a permission. A list of APIs will appear. Under Microsoft APIs, select SharePoint to set the permissions for Project operations. In the same way, you can grant permissions for whichever API you want to fetch data from.
7. Select Delegated permissions. Under Project, select the Project.Read and Project.Write permissions and add them. If you only want read access, select only the read permission.
The permissions get added; now click Grant admin consent to give the admin consent. Click Overview and copy the Application (client) ID, which will be used in the application. The configuration is complete.

Let’s see how to use it in non-browser applications. Here you can create a console application and use the code below. The things you need to consider are:

1. Add a reference to the latest MSAL library (Microsoft.Identity.Client) and to Microsoft.ProjectServer.Client from Microsoft.SharePointOnline.CSOM.
2. Define the scope, which is the permission URL, such as “<SharePoint URL>/Project.Read”. Here we only fetch the list of projects, so the read scope is used; change it to match your operations.
3. The redirect URI is fixed and is the one we added under the AD app above.
4. Use the tenant ID and client ID of the AD app.
5. When we use user credentials, we use the SharePointOnlineCredentials class to set the credentials. Here, since authentication is done via the Azure AD app, we instead pass the token in the request using the Authorization header.

    private static void GetProjectsUsingCSOM()
    {
        string PWAUser = "abc@domain.com";
        string scope = "<SharePoint URL>/Project.Read";
        string redirectUri = "https://login.microsoftonline.com/common/oauth2/nativeclient";
        string pwaInstanceUrl = "PWA URL"; // your PWA URL
        var azureTenantId = "Enter the Azure tenant ID";
        var clientId = "Client ID of the Azure AD app"; // retrieved above

        var pcaConfig = PublicClientApplicationBuilder.Create(clientId)
            .WithTenantId(azureTenantId)
            .WithRedirectUri(redirectUri);

        // Acquire the access token interactively.
        var tokenResult = pcaConfig.Build()
            .AcquireTokenInteractive(new[] { scope })
            .WithPrompt(Prompt.NoPrompt)
            .WithLoginHint(PWAUser)
            .ExecuteAsync().Result;

        // Load the project context and get the projects, passing the
        // token in the Authorization header of each request.
        ProjectContext projectContext = new ProjectContext(pwaInstanceUrl);
        projectContext.ExecutingWebRequest += (s, e) =>
        {
            e.WebRequestExecutor.RequestHeaders["Authorization"] = "Bearer " + tokenResult.AccessToken;
        };
        projectContext.Load(projectContext.Projects);
        projectContext.ExecuteQuery();

        foreach (PublishedProject project in projectContext.Projects)
        {
            Console.WriteLine(project.Name);
        }
        Console.ReadLine();
    }

One thing to note: a pop-up asking for credentials appears if the token is not cached. In a similar way, you can perform any Project Online/Office 365 OData call as well; all other calls can be carried out after adding the appropriate permissions in the AD app. Happy coding!

Supriya Khamesra
Supriya is currently working as a Senior Consultant – Business Applications with Advaiya and has more than 10 years of experience in application development. A Microsoft Certified Professional in EPM, Supriya has extensive knowledge of technologies including SharePoint and Enterprise Project
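To make the OData variant concrete, here is a small sketch (in Python rather than C#, purely for illustration) of how such a request is shaped. The endpoint path follows the standard Project Online ProjectData OData convention, and the access token is assumed to come from an interactive MSAL sign-in like the one in the code above; the site URL and token below are placeholders.

```python
# Sketch: building an authenticated Project Online OData request.
# The access token is assumed to come from an MSAL sign-in as in the
# C# example above; the PWA URL and token here are placeholders.

def build_projects_request(pwa_url: str, access_token: str):
    """Return the URL and headers for a ProjectData OData call."""
    url = pwa_url.rstrip("/") + "/_api/ProjectData/Projects"
    headers = {
        "Authorization": "Bearer " + access_token,
        "Accept": "application/json;odata=verbose",
    }
    return url, headers

url, headers = build_projects_request("https://abc.sharepoint.com/sites/pwa", "TOKEN")
print(url)      # → https://abc.sharepoint.com/sites/pwa/_api/ProjectData/Projects
print(headers["Authorization"])  # → Bearer TOKEN
```

Issuing a GET against that URL with those headers (for example with any HTTP client) returns the published projects, just like the CSOM call, once the Project.Read permission has been granted on the AD app.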

Azure SQL Database DTU Calculator

Microsoft Azure SQL Database is a managed cloud database provided as part of Microsoft Azure, i.e., a PaaS (Platform as a Service) offering. This type of managed database service takes care of scalability, backups, and high availability, and does not require any version upgrading or patching. We also do not have to think about hardware factors such as CPU capacity (number of cores), RAM, disks, or licenses (Express, Standard, Enterprise). We just need to consider DTUs (Database Transaction Units) and the size of the on-premises database, because the capacity of the database is measured in DTUs: the Database Transaction Unit is the unit of measure in SQL Database. For example, if a customer asks how to arrive at the required DTUs for a database server that needs to migrate from on-premises to Azure, the tool named “Azure SQL Database DTU Calculator” can be used to calculate the required DTUs. In short, the DTU model has the advantage of a lower price point and a simple, reasonably accurate sizing measurement for the migration, so you can get started at a lower cost with this model.

How to use the DTU Calculator

Measure resource utilization
To determine the DTU size for your database server, you will need to capture several performance metrics on your SQL Server. For the most accurate measurement, run your production workload for a period of at least 48 hours that captures the expected range of usage. Measure the following utilization metrics for at least 48 hours so the calculator can check utilization over time and give you the best recommendation:

- Processor – % Processor Time
- Logical Disk – Disk Reads/sec
- Logical Disk – Disk Writes/sec
- Database – Log Bytes Flushed/sec

To capture these performance metrics, use one of the provided utilities (a command-line EXE or a PowerShell script) to record your database utilization for at least a 48-hour period. Take care of the following when running the utility in your production environment:

1. Before running the utility/script, ensure that no processes other than SQL Server are utilizing CPU, memory, and disk.
2. Click the link, download the zip file onto your SQL Server, and extract the contents.
3. If you are using the command-line utility, right-click the .exe file and select “Run as administrator”.
4. If you are using the PowerShell script, open Windows PowerShell ISE with “Run as administrator”, browse to the SQL-perfmon.ps1 script file, and press F5 to run the script.

Related: Creating a schedule to run a SQL stored procedure from PowerShell script using Azure automation

Assessment demo of the Azure SQL Database DTU Calculator

Step 1: Go to the DTU Calculator web page and download either utility to collect the on-premises database performance counter utilization. In this example, we will use the command-line utility via the Download Command Line Utility option.

Step 2: Change the parameters if needed. By default, counters are collected at a 1-second interval for 1 hour (3600 seconds), and the log file is created as C:\sql-perfmon-log.csv. To change these parameters, edit the settings in SqlDtuperfmon.exe.config.

Step 3: Upload your performance .csv file, fill in the number of CPU cores, and click Calculate.

Step 4: Review the analysis and results. After clicking the Calculate button, review the charts, which analyze your database’s resource consumption. The charts show the percentage of time (based on your measurements) that your database’s resource consumption falls at each level. You can review CPU, input-output operations, and log throughput individually as well as collectively to better understand which metrics affect the performance of your database. The Service Tier/Performance Level chart represents how often your database workload falls into each service tier and performance level, and the DTUs Over Time chart represents the DTUs calculated across the different hours of your capture.

Also read: Event and trigger in Azure Blob storage
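To build intuition for what the calculator does with the uploaded .csv file, the kind of aggregation involved can be sketched as a simple percentile summary of the captured counters. This is emphatically not the calculator's actual algorithm, and the column names and sample rows below are made up to mirror the four counters listed above:

```python
# Sketch: summarizing a perfmon capture before uploading it to the
# DTU Calculator. This is NOT the calculator's real algorithm -- just
# a nearest-rank percentile summary of each counter column.
import csv
import io

# Made-up sample rows shaped like the four captured counters.
sample_csv = """cpu_percent,disk_reads_sec,disk_writes_sec,log_bytes_flushed_sec
10,100,50,2048
35,220,90,8192
80,400,160,16384
55,310,120,4096
"""

def summarize(csv_text, percentile=0.95):
    """Return the given nearest-rank percentile for each counter column."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    summary = {}
    for col in rows[0]:
        values = sorted(float(r[col]) for r in rows)
        idx = min(len(values) - 1, int(percentile * len(values)))
        summary[col] = values[idx]
    return summary

print(summarize(sample_csv))
```

Looking at high percentiles rather than averages matters here: a workload that spikes to 80% CPU for short bursts needs a higher service tier than its average utilization alone would suggest, which is why the real calculator wants a full 48-hour capture.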

Event and trigger in Azure Blob storage

Binary Large Object storage, popularly known as Blob storage, is more than just a storage platform for large amounts of unstructured data. It has the enthralling feature of triggering an event or service every time a condition evaluates to true. In this blog, I will cover how seamlessly Blob Storage and Logic Apps can integrate to trigger a custom action on Azure Blob storage events.

So, what do we need for this?

- Azure Blob Storage – holding data files in the form of Excel sheets.
- Azure Blob Storage events – an event that gets raised when any file (in our case, an Excel file) gets uploaded or modified in blob storage.
- Azure Logic Apps – will execute the web service through its workflow automation whenever an Azure Blob storage event gets pushed.
- Custom action – what should happen when the Azure Blob storage event triggers. Here we take the execution of a service hosted on Azure to perform specific tasks.

Azure Blob storage events are essential for reacting to events such as the creation, modification, and deletion of records or files in blob storage. Azure Logic Apps provides rich methods to automate workflows that integrate data across Azure services. Now, let’s proceed with the steps to achieve the entire Azure Blob storage event process using Logic Apps.

Note – Before starting, we assume that you know how to create a resource group and a storage account and have already created them in your Azure portal.

Step 1: Enable the resource provider for Event Grid

If you have not used Event Grid in your Azure subscription before, you may need to register the Event Grid resource provider. In the Azure portal:

1. Select Subscriptions on the left menu.
2. Select the subscription you’d be using for Event Grid.
3. In the left-hand menu under Settings, select Resource providers.
4. Look for Microsoft.EventGrid and select Register if it is not registered. Registration can take a moment to complete; click Refresh to check for a status change. When the status updates to Registered, you are ready to continue.
Step 2: Subscribe to the Blob storage events

Subscribing to a topic lets Event Grid know which events you want to track and where to send them.

1. Navigate in the portal to the Azure Storage account that you created earlier: on the left menu, select All resources and select your storage account.
2. On the Storage account page, select Events on the left menu.
3. Select More Options, and then Logic Apps.

Step 3: Create the Logic App

1. In the first step, select your subscription, the resource type, and the resource name.
2. In the Event Type item, choose Microsoft.Storage.BlobCreated. You can add other parameters as well.
3. In the next step, define the action you want to perform when a blob gets created. We took the custom action of calling the web service; you can choose any other action from those available in the Logic App.

Now upload the Excel file to blob storage, and the event will get triggered, which will run the Logic App. If you are planning to create such customizations, connect with Advaiya to get a consultation from experts.

Read more: Data warehousing and infrastructure with Azure SQL
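For reference, the event that Event Grid delivers to the Logic App is a small JSON document. A sketch of picking out the container and blob name from a BlobCreated event follows; the sample payload is made up, but the eventType value and the subject format (/blobServices/default/containers/<container>/blobs/<path>) follow Event Grid's storage event schema:

```python
# Sketch: extracting the container and blob path from an Event Grid
# BlobCreated event. The sample payload below is hypothetical, but its
# shape follows the Azure Storage event schema.

sample_event = {
    "eventType": "Microsoft.Storage.BlobCreated",
    "subject": "/blobServices/default/containers/reports/blobs/sales/jan.xlsx",
    "data": {"url": "https://myaccount.blob.core.windows.net/reports/sales/jan.xlsx"},
}

def parse_blob_created(event):
    """Return (container, blob_path) for a BlobCreated event, else None."""
    if event.get("eventType") != "Microsoft.Storage.BlobCreated":
        return None
    # Subject looks like: /blobServices/default/containers/<c>/blobs/<path>
    _, _, tail = event["subject"].partition("/containers/")
    container, _, blob_path = tail.partition("/blobs/")
    return container, blob_path

print(parse_blob_created(sample_event))  # → ('reports', 'sales/jan.xlsx')
```

Inside the Logic App the same fields are available through the Event Grid trigger's dynamic content, so a condition on eventType or on the file extension (for example, only .xlsx files) can gate the custom action.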