Calling an AngularJS/jQuery method from VBA

To perform custom operations with any Office 365 technology, we make use of jQuery/JavaScript. In one of our client engagements, we came across a need to call a jQuery method from a VBA (Visual Basic for Applications) macro. VBA is a programming language that enables custom operations across various Microsoft Office tools. We implemented this for Project Professional: a SharePoint Online page was loaded inside a VBA form in Project Professional using a WebBrowser control. On this SharePoint page, we used AngularJS and jQuery to perform our custom operation. To display some data on the page, we needed to fetch data from Project Professional onto our custom page, which meant calling a jQuery method from VBA. Since we used AngularJS, we then had to call a method on the controller class. Here is how we carried out this operation.

Keep in mind that the WebBrowser control must be completely loaded before calling the method from VBA; otherwise, VBA will not find the method and will throw an exception. Let's look at the code snippets. In VBA, use the following code:

Sub CallJqueryMethod()
    Dim currentWindow As Object
    Dim selectedValues As String
    Set currentWindow = frm1.WebBrowser1.Document.parentWindow
    selectedValues = frm1.ID
    currentWindow.execScript Code:="ShowFilterData('" & selectedValues & "')"
End Sub

Here, we get the parent window object of the WebBrowser control using frm1.WebBrowser1.Document.parentWindow.
We then call ShowFilterData, a parametric jQuery function on our custom page, using currentWindow.execScript Code:="ShowFilterData('" & selectedValues & "')". Since we used AngularJS, the method invoked from VBA in turn calls a method on our controller, as in the snippets below.

On the HTML page, under the script tag:

<script type="text/javascript">
    function ShowFilterData(data) {
        angular.element(document.getElementById('ContId')).scope().GetData(data);
    }
</script>

In the AngularJS controller JS, the GetData method is defined as:

$scope.GetData = function (data) {
    // Custom logic
};

On the HTML page, we define the ShowFilterData method that will be called from VBA. An id must be assigned to the div to which the controller is attached via the ng-controller directive; here we have given it the id ContId. GetData is a function defined in the controller's JS file. So, we have seen how we can easily call an AngularJS method from VBA. I hope you find it helpful!

Supriya Khamesra
Supriya is currently working as a Senior Consultant with Advaiya and has more than 10 years of experience in application development. She is responsible for Enterprise Project Server, Project Online, and BI solutions. A Microsoft Certified Professional in EPM, Supriya has worked on multiple technologies and platforms, including Azure, .NET, Office 365, Project Server, Project Online, SSRS, CSOM, JSOM, Kendo UI, JavaScript, and SQL Server, among others.
Microsoft Dynamics 365 – Getting started with ribbon workbench

While using Dynamics 365 CRM, have you ever come across a situation where you are re-entering the same data in different locations or forms? If yes, you are in the right place; let's get your problem solved. In this blog, we will walk you through the detailed process of passing parameters and data from one form to another with a single button click, by adding JavaScript to a ribbon workbench button's command. Let's look at a quick example of how you can do so. Follow the steps below:
1. Open ribbon workbench.
2. Select the solution that contains the source entity.
3. Add a new command of type Actions.
4. Under that command, add a JavaScript action and call the OpenDestinationEntityForm function, which is available in the existing JS web resources.
5. Now add the ribbon button and fill in the necessary details. Also, link the command to the button.
6. Publish the changes.
7. Once done, navigate to the account and click the button to view the alert.
Congratulations, you have successfully passed parameters from one form to another using a ribbon button customized with ribbon workbench. If you are planning to customize Dynamics CRM or create custom workflows, business rules, plug-ins, ribbon workbench customizations, etc., get in touch with our team of experts or add your comments below in case of a query.
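The body of the OpenDestinationEntityForm web resource referenced in step 4 is not shown in the original post. A minimal sketch, assuming the modern Xrm.Navigation client API and purely illustrative entity and field names ("contact", "parentcustomerid"), could look like this:

```javascript
// Sketch of the JS action wired to the ribbon command. Entity and field
// names are illustrative, not from the original post. primaryControl is
// supplied by the ribbon as a CrmParameter of type PrimaryControl.
function buildFormParameters(formContext) {
  // Copy values from the source form so the user does not re-enter them.
  return {
    parentcustomerid: formContext.data.entity.getId(),
    parentcustomeridname: formContext.data.entity.getPrimaryAttributeValue()
  };
}

function OpenDestinationEntityForm(primaryControl) {
  var parameters = buildFormParameters(primaryControl);
  // Xrm is provided by the Dynamics 365 client; guarded here so the
  // sketch can also run outside the CRM page.
  if (typeof Xrm !== 'undefined') {
    Xrm.Navigation.openForm({ entityName: 'contact' }, parameters);
  }
  return parameters;
}
```

The key design point is that the destination form receives the copied values via formParameters, so its fields arrive pre-populated.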
How to deal with SharePoint Online list threshold?

What are boundaries and limits in SharePoint lists? Every time you access a list or a document library, a query is raised at the backend which returns the results. For example, if there are 20,000 or more items in a list or document library and you try to display them, you won't be able to see more than 5,000 items in a page/view, and the list/library will run into the 5,000-item list view threshold. Boundaries are absolute limits that cannot be breached; this is a design choice by Microsoft. Limits are advisory guidelines informed by system performance. You can go beyond the suggested Microsoft limits, but there could be performance consequences. A situation in which this sort of protection is necessary: if one user wants to view all 20,000 items in a list, it will slow down the platform for everybody else using it. These boundaries and limits apply to both SharePoint Online and SharePoint On-Premises users, with one difference: in SharePoint Online, the list view threshold cannot be changed, whereas in SharePoint On-Premises the threshold can be altered explicitly, though this is not recommended as it might affect server performance. How to manage the list? In custom pages, as well as for various other reporting needs, when we access data from SharePoint lists using OData, there will be a threshold limit of 5,000 records when the data is vast. In order to overcome this issue and manage the list effectively, we need to create indexed columns. Indexed columns and filtered views: In the list/library settings, a user can create a maximum of 20 indexed columns. When we index a column, we are asking SharePoint to make that column more readily queryable than other columns. We should be aware that this prioritization comes at a cost: increased overhead at the database level.
Therefore, it is advisable to index columns that will be searched and filtered frequently. The concept behind filtered views is to exclude irrelevant items from a list. To make a filtered view effective, the first column used in the filter must be indexed, so that the returned rows or items stay below the view threshold. Steps to index columns: 1. Go to list/library settings. 2. Click on the Indexed columns option. There are multiple ways to overcome this challenge and retrieve the data from the list. OData: In OData queries, querying list items without a filter generally works fine irrespective of the item count in the list. The problem arises when we need to add a filter to the OData query based on some business requirement and the item count in the list is more than 5,000 records; it then gives a threshold limit error. To overcome this, we need to index the columns used for filtering the results in the OData query. We also need to ensure that, after filtering on the indexed column, the result set is not greater than 5,000 records; otherwise, we will have to apply more filters on indexed columns to keep the count well below 5,000 records. Indexing is done in the same way as discussed above for the required columns. Example: /_api/web/lists/getbytitle(‘’)/items?$filter=Column1 eq 2 In the above OData query, Column1 must be an indexed column if the total number of items in the list is greater than 5,000. CSOM – Read: While working with CSOM for SharePoint list operations, the same threshold issue appears when trying to read all the items from a SharePoint list with more than 5,000 records using a CAML query. Here, we have to apply the CAML query in batches to achieve this. Below is an example of reading a SharePoint list with more than 5,000 records using a CAML query.
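The batch-read code in the original post was an image and is not reproduced here. The idea, sketched below in JavaScript against a hypothetical fetchPage helper, is to page through the list in batches smaller than the threshold, carrying a position token (CSOM's ListItemCollectionPosition) from one request to the next:

```javascript
// Sketch of threshold-safe batched reads. fetchPage is a stand-in for a
// CSOM CAML query with a <RowLimit> of, say, 2000: it takes a position
// token and returns { items, nextPosition } (nextPosition is null on the
// last page), mirroring ListItemCollectionPosition.
function readAllItems(fetchPage) {
  var all = [];
  var position = null;
  do {
    var page = fetchPage(position); // one batch, well below 5000 rows
    all = all.concat(page.items);
    position = page.nextPosition;   // carry the paging token forward
  } while (position !== null);
  return all;
}
```

Each round trip stays under the view threshold, so the full list can be read regardless of its total size.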
To summarize, large SharePoint lists can be managed and used effectively through indexing. Important points to note while indexing:
– We can have a maximum of 20 indexed columns per list.
– If we want to apply additional sorting in a view, the sort column must be an indexed column.
– Analyze the list items and check whether some of them can be removed or archived to another SharePoint list.
In case of any query regarding SharePoint Online list thresholds, add your comments below.
Five must-haves to evaluate cloud backup & disaster recovery providers

With the increasing amount of data processing and a wide variety of application usage, data protection has become a critical component of the IT strategy of every business. Even the slightest downtime can significantly impact productivity. Therefore, your daily operations can be compromised if you fail to recover your precious data quickly in the event of a natural disaster, storage failure, or cyber-attack. A vast majority of companies are aware of the importance of data protection. However, nearly 3 out of 4 companies worldwide are failing in terms of disaster readiness, according to the Disaster Recovery Preparedness Benchmark Survey, 2014. While cloud backup and disaster recovery providers often offer affordable, reliable, and efficient data protection solutions, choosing the right service for your specific business needs can be a challenge. So, before you implement your backup and DR strategies in the cloud, here are five things to take into consideration when evaluating cloud backup providers. 1. Experience – The DR budget typically ranges from 5-7 percent of the total IT budget. Hence, it is important to find a cloud backup service provider that has the capability to meet the specific requirements of your organization and the expertise to work within the available budget. 2. Testing recovery – Testing recovery operations is essential when selecting a service provider offering DRaaS/BaaS capabilities. It is recommended to perform an initial test at the provider's location before you sign a contract, to surface any issues that might occur during recovery. Infrastructural differences may confine you to working within the boundary of the provider's cloud, which may force you to redesign your infrastructure from scratch. 3. SLA – When considering the SLA, you should be careful about even the minutest detail of the service your provider is offering.
Before signing up with any backup as a service (BaaS) or disaster recovery as a service (DRaaS) provider, be sure to read the fine print in the SLA. If something is not explicitly written into the SLA, you should not assume that it is a covered service. 4. Security – Some providers encrypt data at rest on their servers, but not data in transit. It is recommended that data be encrypted at no less than 256-bit AES (Advanced Encryption Standard) strength before it leaves the customer's site for the cloud provider, so that the data in transit cannot be stolen as plain text. The company should hold all encryption keys and not allow the BaaS or DRaaS provider the ability to decrypt any data stored on its servers. 5. Compliance – As cloud-based storage has become a commodity, providers might tend to store your data on inexpensive storage with poor security. So, if your data backup is subject to government or industry security regulations, you must ensure that the data centers where your backups are stored meet the required compliance standards. For example, customers' health data should be stored in HIPAA-compliant (Health Insurance Portability and Accountability Act) clouds, and credit card data should be stored in PCI DSS-compliant (Payment Card Industry Data Security Standard) clouds. In short, SMBs that try to do all the recovery themselves may miss some essential requirements; having a key contact on the provider side should help eliminate those errors. However, it is highly recommended that you do your due diligence to ensure that the provider you choose for cloud backup and disaster recovery services can assist in recovering files the right way, while providing technical support and assistance as and when needed.
Sitecore 9.3 installation using Sitecore Install Assistant (SIA)

In November 2019, Sitecore released a new version, 9.3.0, of the Sitecore Experience Platform (XP). This version focuses mainly on feature upgrades. One such upgrade is SIA, the Sitecore Install Assistant. In earlier versions, developers new to the Sitecore platform often faced difficulty setting up and installing the Sitecore environment on their local machines. Now, with the release of Sitecore v9.3, it has become easy to install and set up the Sitecore environment with the help of SIA. When we download Sitecore 9.3, it comes equipped with SIA, a GUI that asks the user to enter various parameters like the name of the website, SQL admin credentials, SOLR configuration information, etc. After taking all this information, SIA starts processing and sets up the Sitecore environment on the user's machine. SIA guides the user throughout the installation process, from the review of system requirements to the installation of prerequisites. During the installation of Sitecore v9.3, we get an option asking for confirmation to install both SOLR and the SXA module. With this feature, the user does not need to worry about installing them afterward, as was the case in previous versions. So, if you are planning to install Sitecore 9.3, here are the complete step-by-step guidelines for its installation and setup with SIA: 1. Downloading the Sitecore v9.3 installation files: The Sitecore installation package can be downloaded from the official Sitecore developer site using the link below (different packages are available on this link; the first two include SIA with the respective package): https://dev.sitecore.net/Downloads/Sitecore_Experience_Platform/93/Sitecore_Experience_Platform_93_Initial_Release.aspx To set up a development environment, we use the XP Single package if there is no need for the XM Scaled package.
Now, download the required package (this post references the graphical setup package for XP Single) and unzip it at the desired path. It will show the folder content, as depicted below, with all 23 files. 2. Check for prerequisites: The Sitecore package contains an executable setup file for SIA. Run it in administrator mode and click Start. It will check for the .Net Framework first. We can use SIA directly if .Net Framework 4.7.2 is already installed. If not, it has to be installed; it can be downloaded from here. A system restart will be needed after this installation before proceeding further. Other prerequisites, like the Sitecore Install Framework (SIF) and Windows Server requisites, are installed automatically by SIA in the next step. We can skip this installation if they are already installed. 3. Configuration setup: Open the setup .exe config file in Notepad or any other editor. This configuration file contains SOLR, SQL, and site information:
– SQL Server name
– SQL Server admin credentials
– Solr Windows service name
– Solr service port number
– Solr server URL
– Solr root path
– Site prefix, suffix
We can update the configuration file, or we can specify these settings at installation time as well. If we already have SOLR installed, or multiple SOLR instances running on our system, we can change the port number and provide another port in the configuration file (at line number 14). The SQL Server settings can also be specified in the configuration file, and we can change the site prefix and suffix if we want. 4. Running the installation: Continuing after the prerequisites check and the installation of .Net Framework 4.7.2 and SIF with the Windows Server requisites, the next screen asks for the SOLR configuration details. 4.1 SOLR installation: The SOLR port is taken from the configuration file set up in the earlier steps.
Other details, like the Windows service name and installation path, need to be entered here:
– Windows service and path prefix is the name of the Windows service running for SOLR version 8.1.1.
– Install path is the complete path of the SOLR version 8.1.1 installation.
We can install SOLR from here; if it is already installed, this step can be skipped by just filling in the required details. After the successful installation of SOLR, the setup shows a success message, and we can check the SOLR instance by browsing the URL http://localhost:8984/solr. 4.2 Sitecore settings: In the next step, the setup asks for the below Sitecore settings:
– Sitecore site prefix
– Admin password to access the Sitecore CMS
– License file: the path of the Sitecore license file to be used for this setup
4.3 SQL Server settings: The next step asks for SQL Server settings like the MS SQL Server name, admin username, and password. If we are installing using local SQL, the predefined settings with the sa account will work; we just need to provide the password. If we are using any other SQL Server, we need to provide the correct server name and admin credentials for it. 4.4 SOLR configuration: If the setup installed SOLR, we only need to check these configuration values; there is no need to change anything. In the case of a manual SOLR setup, we need to make the necessary changes in the configuration and click Next. 4.5 Installing SXA: The next step asks whether to include the Sitecore Experience Accelerator (SXA), which is an optional module. We can decide whether or not to install SXA with this installation. The next step shows a summary of the settings and configuration provided so far, before the actual installation starts. If we find anything wrong, we can go back and correct it. When we proceed with the installation, it also validates the details and verifies that all requirements are in place for the installation to proceed.
Click Next to start the installation process. We can check the installation status in the logs by clicking on the down arrow. If it shows "Everything is running smoothly," we are good to go. After some time, the installation completes, and we can see the success message as below.
Key steps for configuring Azure app gateway for multi‑site setup

In a scenario where you want your website to be publicly accessible by all users, you cannot apply IP restrictions to it. Such Azure App Services (Web Apps) are publicly exposed to the Internet by default, accessible via their custom domain or *.azurewebsites.net URL, making them vulnerable to hackers and spammers, including Distributed Denial of Service (DDoS) attacks. An Application Gateway with the Web Application Firewall (WAF) tier in prevention mode can help protect against Layer 7 (HTTP/HTTPS/WebSocket) attacks. Application Gateway is an HTTP load balancer that allows you to manage traffic to your web apps. As it operates at Layer 7 (the application layer), it can scan incoming requests using the OWASP common vulnerabilities rule set. Many a time, it has been observed that users face difficulty configuring their app services with Application Gateway, especially in a multi-site scenario or with app services having multiple custom domains.

How to configure Application Gateway with Azure App Service
In this article, I will provide some key points to consider when configuring Application Gateway with Azure App Service in a multi-site scenario.

App Gateway Configuration
To allow the application gateway to reach your app service, ensure that a Network Security Group (NSG) is not applied to, or blocking, your gateway's subnet.

Backend Pool Configuration
Ensure that you associate your app service in the target. To do so, select "App Services" as the Target Type and choose your app service name from the Target drop-down. Note: If you have multiple custom domains in a single app service, one app service as a backend pool will suffice. In case you have multiple app services, each with a custom domain, you need to associate each of your app services by choosing them from the Target drop-down.
Configuring Custom Probe
You can create a common health probe for all the websites by ensuring that "Pick hostname from backend HTTP settings" is checked, so that the probe picks the hostname from the HTTP settings, which in turn picks the hostname defined in the app service. That is how it identifies the custom domain rather than just the app service name. You may optionally use a probe match condition with the 200-399 HTTP status response codes.

Configuring Listener
When creating a listener, create a multi-site listener. Ensure that you provide the custom domain of the app service in the Hostname field. Choose HTTP or HTTPS based on your website's requirement. If your website requires HTTPS, a PFX certificate for the website is required.

HTTP settings configuration
Create an HTTP setting, then associate the custom probe and save the HTTP settings again. Make sure that "Use for App Service" is unchecked. Note: You will need to upload a .CER certificate if your website requires HTTPS. Next, ensure that you keep the "Pick hostname from backend address" box unchecked and provide the hostname of the website in the "Override hostname" field. This is required when you have multiple custom domains so that Application Gateway knows which custom domain it must pick. Note: The request timeout should be greater than the app service's response time. This allows enough time for the app service to respond before the request times out at the Application Gateway and helps prevent 502 server errors.

Configuring rule
You can create a basic rule and associate the listener, backend pool, and HTTP setting with it.

Configure redirection (required in case of HTTPS)
After you have created all the above settings, you will need to configure redirection if your website is accessible over HTTPS; for example, http://mywebsite.contoso.com will not be redirected automatically to https://mywebsite.contoso.com.
To do so, you will need to create two things:
1. Create another multi-site listener on port 80.
2. Create another rule and associate it with the listener on port 80.

DNS configuration
To reach your app service on its custom domain, you must first add a CNAME record pointing to the app service URL, for example, mysite.azurewebsites.net. Then bind your website with an SSL certificate. After you have configured the Application Gateway, listeners, HTTP settings, and health probe, check the Backend Health status; it should be green. Then update the CNAME record to point to the Application Gateway DNS instead of the app service. You can find the Application Gateway DNS URL in the Overview section under "Frontend Public IP Address." Note: If there are multiple custom domains/websites, the configuration steps mentioned above need to be followed for each website/domain. Get the cloud services required to develop, manage, and deploy applications on a vast global network with the Microsoft Azure cloud computing platform. Advaiya provides end-to-end services for cloud solutions. In case of any query, get in touch with our Azure experts or add your comments below.
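Conceptually, a multi-site listener routes by the Host header, and the optional probe match condition treats 200-399 responses as healthy. A toy sketch of those two rules (hostnames and pool names are illustrative, not from the original configuration):

```javascript
// Toy model of multi-site routing: each rule maps a hostname to a
// backend pool, the way listener + rule + backend pool do in the
// gateway. Hostnames and pool names below are made up for illustration.
var rules = [
  { hostname: 'mywebsite.contoso.com', backendPool: 'appservice-pool-1' },
  { hostname: 'shop.contoso.com', backendPool: 'appservice-pool-2' }
];

function routeRequest(hostHeader) {
  var match = rules.find(function (r) { return r.hostname === hostHeader.toLowerCase(); });
  return match ? match.backendPool : null; // no listener match -> gateway returns an error
}

// Probe health matching: with the optional match condition, any status
// from 200 through 399 counts as healthy.
function isHealthyStatus(code) {
  return code >= 200 && code <= 399;
}
```

This is why the Hostname field on each listener and the "Override hostname" field in the HTTP settings matter: without the right host values, the gateway cannot tell the custom domains apart.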
Schedule a SQL stored procedure from PowerShell using Azure Automation

Scenario: We have a SQL stored procedure that we want to schedule to run once a month. Our database is on Azure, and we don't have an Azure VM or SQL Server; we only have an Azure database. So, to run this procedure automatically, we can't create a SQL Agent job. After some research, we decided to create a runbook and schedule a PowerShell script. Below are the steps we followed.

Step 1. Create an Automation Account – Log in to https://portal.azure.com. Go to All services, then Automation Accounts, and click Add. Fill in all the details for the account and click Create. You will see the newly created account added to the list.

Step 2. Create a Runbook – Open this newly created account, go to Runbooks, and click Create a Runbook. Select PowerShell as the Runbook type and create the Runbook. It will take some time to create the Runbook.

Step 3. PowerShell script – Once the Runbook is created, paste the below script into the Edit PowerShell Runbook pane:

$sqlSvrName = "DBServerName"
$sqlDbName = "DatabaseName"
$sqlUserName = "DatabaseUserName"
$sqlPassword = "DatabaseUserPassword"
$sqlStoredProcedure = "StoredProcedureName"

$sqlConnection = New-Object System.Data.SqlClient.SqlConnection
$sqlConnection.ConnectionString = "Server=$sqlSvrName;uid=$sqlUserName;pwd=$sqlPassword;Initial Catalog=$sqlDbName"
$sqlConnection.Open()

$sqlCmd = New-Object System.Data.SqlClient.SqlCommand
# Specify that the command is a stored procedure
$sqlCmd.CommandType = [System.Data.CommandType]'StoredProcedure'
# Specify the name of the stored procedure
$sqlCmd.CommandText = $sqlStoredProcedure
$sqlCmd.Connection = $sqlConnection
# Optional: pass parameters to the stored procedure, e.g.
# $sqlCmd.Parameters.Add("@rundatetime", [System.Data.SqlDbType]::DateTime).Value = Get-Date
$sqlCmd.ExecuteNonQuery()
$sqlConnection.Close()

Step 4. Test the PowerShell script – You can test the script by opening the Test pane. Click the Start button to begin testing the script.
After testing the script, publish it so you can schedule it.

Step 5. Schedule the Runbook – Go to Resources, then Schedules, and click Add a schedule. Choose Create a new schedule and fill in the schedule details as per your requirements. After clicking OK, the newly created schedule gets listed in the schedule list, as shown below. Congratulations, you have just created a schedule using Azure Automation to run a SQL stored procedure from a PowerShell script.

Key note: Other than PowerShell scripts, Azure Automation also supports several other types of Runbooks, like Graphical (created and edited in the graphical editor on the Azure portal), Graphical PowerShell Workflow, PowerShell Workflow, and Python. If you are planning to create a schedule to run a SQL stored procedure from a PowerShell script using Azure Automation, or any other such combination, get in touch with our team of experts or add your comments below in case of a query.

Shruti Vyas
Shruti is a technical consultant at Advaiya. With over nine years of experience in the IT industry, Shruti has a passion for data and an ability to understand and analyze it effectively. She shares her insights on various topics such as custom application development, project server customization, and client-side scripting.
Accessing Azure AD Authenticated Web App in applications

Azure Active Directory (Azure AD) is a cloud-based identity and access management service provided by Microsoft, which allows users of an organization to sign in and access various resources such as Microsoft Office 365, the Azure portal, and many other SaaS applications. Nowadays, Azure AD is used as a standard approach for authenticating various Office 365 based applications, allowing them to work with a user's existing credentials. So, if you are developing applications that are deployed in the Azure environment, the authentication mechanism of choice is Azure AD, instead of creating and maintaining a custom authentication page. Recently, we were working on a project where we needed to access a WCF service deployed on Azure from a .Net application. This worked fine when accessed anonymously, but we faced numerous challenges while making it work with Azure Active Directory authentication. In this article, I will cover the steps to carry out Azure AD authentication from an application developed in .Net that consumes a WCF service or any app deployed on Azure. I will also share the steps for accessing a secured Azure App using JavaScript. For using Azure authentication, your organization's Active Directory should be synced with Azure. Let's first see how to enable AD authentication on an Azure Web App.

Enabling Azure AD Authentication on an Azure App
1. Log in to your Azure account at portal.azure.com with your organization's Office 365 account.
2. Click on All resources in the left pane and select the Azure Web App service on which Azure AD authentication needs to be enabled.
3. Click on the Authentication/Authorization option in the settings section of the left pane.
4. At present, the Azure App is anonymous. To enable app authentication, click On.
5. You will see a screen as shown in the picture below.
6.
Select the Log in with Azure Active Directory option from the "Action to take when request is not authenticated" dropdown. To carry out the configuration, click on the Authentication Providers option below.
7. You will see a screen like this:
8. Select Express mode, click Create New AD App, give a name to the app, and then click OK.
9. Save these configurations, and you will see that the status has changed to "Configured" under Authentication Providers. Now click Save.
10. After creating the Azure AD App, we need to add the below redirect URLs of the Azure App for which the authentication is applied, under Advanced Settings, and then Save:
https://WebAppName.azurewebsites.net
https://WebAppName.azurewebsites.net/WCFServiceName.svc
Here, WebAppName is the name of your Azure App created for service deployment.
11. Now the Azure AD App is created. For further configuration, navigate to the Azure AD App from the left panel, under All services > Identity > Azure Active Directory.
a. Under App registrations, the AD App created will be visible.
b. Click on All applications. It will list all the Azure apps. Select the app you created.
c. Copy the Application ID for the selected app; we will use it later in the code.
d. Now, to generate a secret key, click on Certificates & secrets.
e. Under Client secrets, click on New client secret and add a new one. Select the Never expires option and click Add.
f. Copy the generated value for future use, as you will not be able to see it again when you revisit the page and would have to generate a new one. Note: If the key you generated contains any special characters (+, [], etc.), generate a new key, as it will not work.
g. Now select Authentication under the AD App.
h. The below screen will be displayed:
i. Click on Add a platform and then select Web applications.
j. Add the below redirect URL: https://WebAppName.azurewebsites.net/WCFServiceName.svc
k.
Under Implicit grant, select ID tokens and click Configure. Now, using Add URI, add https://WebAppName.azurewebsites.net/.auth/login/aad/callback. Here, WebAppName is the name of your Azure App created for service deployment.
l. In case you need to access the service in a JavaScript file, you also have to add the URL of the web page that will reference your custom JavaScript file with Azure authentication (like https://PWAUrl/Project%20Detail%20Pages/test.aspx). This will act as a redirect URL after authentication.
m. Thus, of the three URLs in all, two are for the web service, and one is for the JavaScript page redirection.
12. Get the TenantID and AzureAuthUrl. For this, click on Endpoints from App registrations and copy the OAuth 2.0 Token Endpoint (V1) URL, which is the AzureAuthUrl: https://login.microsoftonline.com/<TenantID>/oauth2/token
Now, let's apply Azure authentication in the applications where the web service will be consumed.

Azure AD authentication in .Net applications
For carrying out Azure AD authentication in .Net applications, we will need the below information, generated in the steps above, assigned to these variables in the code:
– OAuth 2.0 Token Endpoint URL, which is the AzureAuthUrl (copied in step 12): authURL
– Application ID (copied in step 11): clientID
– Client secret key (generated and copied in step 11): clientCode
– WCF service URL deployed on the Azure App: resourceURL
Copy these values into the code snippet described below. This code carries out an HTTP web request to the auth URL with all parameters passed in a POST and gives us an access token, which is then added as the Authorization header on the service call from the .Net application. Here is the code that will give you the access token to be passed to the service call.
string postParameters;
string authURL = "https://login.microsoftonline.com/<TenantID>/oauth2/token?api-version=1.0";
string grant_type = "client_credentials";
string clientID = "ApplicationID";
string clientCode = "client secret key";
string resourceURL = "https%3A%2F%2Fwcfservice.azurewebsites.net%2FService.svc";
postParameters = "grant_type=" + grant_type + "&client_id=" + clientID + "&client_secret=" + clientCode + "&resource=" + resourceURL;

var dataToSend = Encoding.UTF8.GetBytes(postParameters);
var req = System.Net.HttpWebRequest.Create(authURL);
req.ContentType = "application/x-www-form-urlencoded";
req.Method = "POST";
req.ContentLength = dataToSend.Length;
req.GetRequestStream().Write(dataToSend, 0, dataToSend.Length);
var response = req.GetResponse();
var readStream = new StreamReader(response.GetResponseStream(), Encoding.UTF8);
string data = readStream.ReadToEnd();
Newtonsoft.Json.Linq.JObject json = Newtonsoft.Json.Linq.JObject.Parse(data);
if (json.Count > 0)
{
    string clientToken = (string)json["access_token"];
    // Add clientToken as the Authorization header on the service call
}
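The same client-credentials request can be made from JavaScript, for example from Node.js on a server (the client secret must never ship to a browser). A sketch, with the tenant ID, client ID, secret, and resource as placeholders to be filled from the steps above:

```javascript
// Build the x-www-form-urlencoded body for the Azure AD v1 token
// endpoint. All argument values here are placeholders.
function buildTokenRequestBody(clientId, clientSecret, resource) {
  return [
    'grant_type=client_credentials',
    'client_id=' + encodeURIComponent(clientId),
    'client_secret=' + encodeURIComponent(clientSecret),
    'resource=' + encodeURIComponent(resource)
  ].join('&');
}

// Sketch of the POST itself (requires Node 18+ for global fetch).
async function getAccessToken(tenantId, clientId, clientSecret, resource) {
  const res = await fetch('https://login.microsoftonline.com/' + tenantId + '/oauth2/token', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: buildTokenRequestBody(clientId, clientSecret, resource)
  });
  const json = await res.json();
  return json.access_token; // send as "Authorization: Bearer <token>"
}
```

Note that encodeURIComponent handles the special-character problem mentioned in the secret-key note above, since the secret is escaped before being placed in the form body.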
How to Update Multi-Select Task Level Custom Field Using CSOM

Requirement – In one of our recent projects, a PMO tool implementation for a client in the EPC vertical, we were required to create tasks with multiple resource assignments. The solution architecture consists of Kendo UI for the user interface, a REST service as the middle layer, and an Azure logic app for automating the REST service calls at scheduled intervals. To create tasks with multiple resource assignments in the project plan, we passed the task details from the interface through the .Net based REST service. The page looks as below. The user navigates to the page and adds a new task item; on saving, it gets created as a new item in the SharePoint list on that particular project site. At the end of the day, at the scheduled time, the logic app triggers the WCF service method to add the newly created SharePoint list items as tasks in the project plan. Data types and fields mapping – Problem – Here, we came across a situation where we had to update a multi-select task level custom field. We tried doing this with multiple CSOM queries and approaches, but we were not able to achieve it. Solution – We developed CSOM logic as below. This method can be followed to update a multi-select task level custom field. Author recommended: .Net Managed Client Side Object Model for Project Online; How to Migrate SharePoint List Items to Project Online. Need more guidance on SharePoint? Write to us to discuss how we can help you and your team become more efficient. In case of a query, add your comments below.
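The CSOM snippet in the original post was an image and is not reproduced here. To our understanding, the key point is that a multi-select lookup custom field on a task takes an array of lookup-entry identifiers of the form "Entry_<GUID>", not the display values. A small helper, sketched in JavaScript with the lookup table passed in as an assumption (it would be read from the custom field's lookup table via CSOM/REST beforehand), shows the shape of the value to assign:

```javascript
// Sketch: convert selected display values into the "Entry_<GUID>" array
// a Project Online multi-select lookup custom field expects. lookupTable
// is an assumed map of display value -> lookup entry GUID.
function toMultiSelectFieldValue(selectedValues, lookupTable) {
  return selectedValues.map(function (value) {
    var guid = lookupTable[value];
    if (!guid) {
      throw new Error('No lookup entry for value: ' + value);
    }
    return 'Entry_' + guid;
  });
}
// The resulting array is what gets assigned to the task's custom field
// (e.g. task["Custom_<fieldGuid>"]) before calling update and publish.
```

Assigning the whole array in one go, rather than one value at a time, is what makes the multi-select update work.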