Key Vault – Failed to sync the certificate.: The service does not have access to ‘*’ Key Vault

TLDR; How to fix Failed to sync the certificate.: The service does not have access to ‘*’ Key Vault

Hello! You may want to read this post if you have come across one of the following errors related to Key Vault:

  • Failed to update all the resources with the latest certificate
  • Failed to sync the certificate.: The service does not have access to ‘*’ Key Vault
  • Resource Microsoft.Web/certificates “[XXX]” failed with message { “Code”: “BadRequest”, “Message”: “The service does not have access to ‘/subscriptions/[subscription]/resourcegroups/[resource-group-name]/providers/microsoft.keyvault/vaults/[vault-name]’ Key Vault. Please make sure that you have granted necessary permissions to the service to perform the request operation.” }
  • Failed to add App Service certificate to the app, Check error for more details. Error Details: The service does not have access to ‘XXX’ Key Vault. Please make sure that you have granted necessary permissions to the service to perform the request operation.

add/sync certificate error

Background:

I came across a similar challenge recently.

We have an App Service Certificate purchased from Microsoft Azure Portal, and stored in a Key-Vault.

The same certificate was then imported and bound to many app services.

The App Services Certificate was configured to auto-renew and it was expected to be synced once it renews, but surprisingly – it just expired.

What Happened

When I investigated, I found that the App Service Certificate had been renewed some time before the expiry date, but it did not sync for some reason.

I, obviously, tried googling for solutions and explored a few links, but none of the suggestions worked for me. Then I decided to contact Microsoft Azure Support.

In the process, I found a potential solution. I appreciate that they have initial troubleshooting cards available on the support page.

Azure portal Trouble-Shooting page

Though I read it, it didn’t help. Then I realized that step #2 had some content rendered as raw markdown:

I fixed the markdown to recover the intended message, and got this:

The two service principals above need to be granted the permissions mentioned.

These two service principals are default Resource Provider principals, and their object IDs are supposed to be the same for everyone.

How to fix the permissions on the key-vault

  • Navigate to your key-vault
  • Click on “Access policies”
Navigate to Access policies
  • Click “Add Access Policy” if the two given service principals are not already added.
  • Assign the permissions for the two service principals as in the table below:
Service Principal                        | Secret Permissions     | Certificate Permissions
Microsoft Azure App Service              | Get                    | Get
Microsoft.Azure.CertificateRegistration  | Get, List, Set, Delete | Get, List

access policies
  • Save the access policy changes.
  • Navigate to the App Service Certificate in question
  • Click on “Rekey and Sync”
rekey and sync
  • Click on “Sync”. Make sure your certificate is listed under “Linked Private Certificate”.
  • After the sync, the existing certificate should be updated to the renewed one.
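If you prefer the command line, the same access policies can be granted with the Azure CLI. This is a hedged sketch: `<vault-name>` is a placeholder, and the app IDs below are the well-known first-party service principal IDs commonly documented for App Service Certificates — verify them for your environment before relying on them.

```shell
# Grant the "Microsoft Azure App Service" service principal Get on secrets and certificates
az keyvault set-policy --name <vault-name> \
  --spn abfa0a7c-a6b6-4736-8310-5855508787cd \
  --secret-permissions get \
  --certificate-permissions get

# Grant the "Microsoft.Azure.CertificateRegistration" service principal its permissions
az keyvault set-policy --name <vault-name> \
  --spn f3c21649-0979-4721-ac85-b0216b2cf413 \
  --secret-permissions get list set delete \
  --certificate-permissions get list
```

After running both commands, the “Access policies” blade should show the two principals with the permissions from the table above.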

Summary

To fix the app service certificate sync issues – you need to fix the permissions on the key vault for the two service principals and then Sync the certificate once. Please refer to the post above for the details of the service principals.

Thanks for reading this article on DevsDaily.com. 🙂

Error CS0234: The type or namespace name ‘HelloWorld.Core’ does not exist in the namespace ‘HelloWorld’

Warning MSB3245: Could not resolve this reference. Could not locate the assembly “HelloWorld.Core”. Check to make sure the assembly exists on disk. If this reference is required by your code, you may get compilation errors.

Error CS0234: The type or namespace name ‘Core’ does not exist in the namespace ‘Document360’ (are you missing an assembly reference?)

Are you seeing your Azure Pipelines build failing with a similar error/warning message? It’s very likely a missing project dependency issue. It’s a common case when you have a solution containing multiple projects and have added project-A as a dependency of project-B in the same solution.

How to fix Error CS0234: The type or namespace name ‘XXXX.Core’ does not exist in the namespace ‘XXXX’?

Check whether you have added the project as a dependency. It’s likely that you added a reference to the project’s DLL instead of to the project itself within the solution.

While adding a reference to another project, always reference the project itself, not the project’s DLL found by browsing for it.

While adding the reference, you can find all the projects in the current solution under the “Projects” tab. Select the project you want to reference. This way the referenced project is always built first, and then the current one (resolving the dependency).

There’s another way of doing this. You can right-click the solution and select “Project Dependencies”. This window lets you select a project and mark the projects it depends on.
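In the .csproj file, a correct project reference looks like the sketch below (the paths and project names are hypothetical, matching the HelloWorld example in the error messages above):

```xml
<!-- Hypothetical HelloWorld.csproj: reference the project file, not its compiled DLL -->
<ItemGroup>
  <ProjectReference Include="..\HelloWorld.Core\HelloWorld.Core.csproj" />
</ItemGroup>
```

If you instead see a `<Reference>` element with a `<HintPath>` pointing at a bin folder, that is the DLL-style reference that breaks clean pipeline builds.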

Happy Learning! 🙂
DevsDaily | Azure DevOps

How to reset MySQL root password

It’s hard to memorize all your passwords these days, especially when you set a very strong password with a combination of letters, special characters, and numbers. My blog is hosted on WordPress. I lost all the passwords stored in Google Chrome, including the one for my WordPress admin account. Since it’s WordPress hosted on an Ubuntu virtual machine with MySQL, I didn’t have access to cPanel to reset the password. My only option was to connect to the MySQL database and reset the password there. But guess what! I had forgotten the password for MySQL as well.

Struggle of two hours on internet

I’m not an Ubuntu/Linux expert, so I googled “How to reset MySQL root password on Ubuntu 18.04” and found lots and lots of solutions. I tried them for two hours and none of them worked!

My MySQL version is 5.7.29. I later realized that many of the solutions I found apply to older or otherwise different versions of MySQL.

I tried various combinations from the articles and was finally able to reset the MySQL root password after a long struggle.

How to reset MySQL root password on Ubuntu 18.04

You can verify the MySQL version using “mysql -V” command on the terminal. Here’s my output:
mysql -V : mysql Ver 14.14 Distrib 5.7.29, for Linux (x86_64) using EditLine wrapper

Steps to reset the root password

Note: you can skip the “sudo ” prefix in the commands if you’re logged in as a superuser.

Step 1: Stop the MySQL Service

sudo /etc/init.d/mysql stop

Step 2: Make sure the directory /var/run/mysqld exists and is owned by the mysql user

sudo mkdir /var/run/mysqld
sudo chown mysql /var/run/mysqld

Step 3: Start the MySQL daemon with the --skip-grant-tables option. Notice the ampersand ‘&’ at the end; it’s required so the process runs in the background. After the command runs, press the ENTER key to get back to the prompt.

sudo mysqld_safe --skip-grant-tables &

# You should see the following output:

[1] 19564
admin@blogserver:~$ 2020-03-18T21:15:59.872516Z mysqld_safe Logging to syslog.
2020-03-18T21:15:59.879527Z mysqld_safe Logging to '/var/log/mysql/error.log'.
2020-03-18T21:15:59.922502Z mysqld_safe Starting mysqld daemon with databases from /var/lib/mysql

Step 4: Log into MySQL without a password

sudo mysql --user=root mysql

# if you see the MySQL prompt("mysql>"), Voila!! You're into it now and can change the password for any user, including root.

Step 5: Run the following command on mysql prompt to update the password for the root user, flush privileges, and exit from the prompt.

mysql> UPDATE user SET authentication_string=PASSWORD('new-strong-password') WHERE user='root';
mysql> FLUSH PRIVILEGES;
mysql> EXIT;
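Note that the UPDATE … PASSWORD() statement above is specific to MySQL 5.7 and earlier; the PASSWORD() function was removed in MySQL 8.0. On 8.0, after starting the server with --skip-grant-tables, a rough equivalent would be:

```sql
-- MySQL 8.0+: reload the grant tables first so account-management statements work,
-- then set the password with ALTER USER
FLUSH PRIVILEGES;
ALTER USER 'root'@'localhost' IDENTIFIED BY 'new-strong-password';
FLUSH PRIVILEGES;
EXIT;
```

The remaining steps (killing the mysqld processes and restarting the service) are the same.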

Step 6: To be on the safe side, kill all running MySQL processes, start the service again, and try connecting to the MySQL instance with the new password.

# Kill all running MySQL processes
sudo killall -u mysql

# Start MySQL service
sudo /etc/init.d/mysql start

# Connect to the database using new password
sudo mysql -u root -p

# Enter new password and you should be able to see the welcome message from MySQL service.

You’re done! Your MySQL root password is successfully reset.

I really spent hours finding the right solution, so I compiled this article to help people save time wandering across the various articles on Google. Hope it helps someone!

Happy Learning. 🙂 Explore more articles from DevsDaily!

How to run Console Applications on Azure Pipelines

Learn how to run console apps on Azure DevOps without hosting!

I came across multiple similar questions on StackOverflow about running a console application on Microsoft Azure, with answers mostly suggesting WebJobs. Here I’m sharing another way of running console apps, on Microsoft Azure DevOps with Azure Pipelines.

[Azure]: Run your console app on Azure Pipelines

If you have a subscription to Azure DevOps, you will be able to run your console app. Upload your console app and create a pipeline to run a script. Add the command and necessary arguments if any and run the pipeline.

Note: At the time of writing this post I'm not sure of any downside of this approach, will update here if I find one.

What did I do?

In Azure DevOps, under one of my demo Organizations, I created a test project (“TestProject”) and initialized the empty repository with a README.md file.

empty repository initialized with README.md file

Created a new C# Console App Project in Visual Studio 2019. “HelloWorld-Console”.

HelloWorld-Console app

Published the executable. While publishing, I changed the deployment mode to “Self-contained” and enabled the “Produce single file” option. This produced two files in the publish folder:

  • HelloWorld-Console.exe, and
  • HelloWorld-Console.pdb

publish options
publish output

Since the .pdb file contains debugging info, I was concerned only with the executable (.exe) file.

I tried uploading the file to the Azure DevOps repository directly from the browser, but the browser upload has a limit of 20 MB per file. So I chose to upload it using the git tools.

I cloned the repository to my machine, copied the executable (.exe) from the publish folder into the repository, staged the changes, committed, and pushed to the remote repository.

Uploading the Console App to Azure DevOps repository

Finally, I set up a pipeline with a “Command Line Script” task to run the executable. Since it’s a Windows executable, I set the agent specification to “windows-2019”.

In the “Script” block I entered just the file name of the console app, as it didn’t require any additional parameters to run.

Pipeline with “Command Line Script” task
Agent Specification set to “Windows-2019” for the pipeline

The pipeline ran just fine and I could see the output from the console app in the logs. 🙂

Command Line Task output in the pipeline log

In the pipeline, I extended the command to save the console app’s output to a text file and added one more task to publish the output file to the artifacts directory. I kept the artifact name as “pipeline-run-$(Build.BuildNumber)” in order to have a different folder for every run, each containing the output text file.

Extended the Command Line Script’s script to save the output in helloworld-output.txt

Added a “Publish Pipeline Artifact” task to publish the console app’s output file to the Artifacts folder.

Now for every run of the pipeline, I could see an artifact produced, containing the output text file.

successful run with an artifact
Artifact
the output text file contains the expected “Hello World!”
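The setup described above can also be expressed as a YAML pipeline. This is a hedged sketch: the file name, output path, and artifact naming mirror the classic-editor setup from this post, so adjust them to your repository layout.

```yaml
# Hypothetical azure-pipelines.yml equivalent of the classic-editor pipeline above
pool:
  vmImage: 'windows-2019'

steps:
  # Run the committed executable and capture its console output to a text file
  - script: HelloWorld-Console.exe > helloworld-output.txt
    displayName: 'Run HelloWorld console app'
    workingDirectory: $(Build.SourcesDirectory)

  # Publish the output file as a per-run artifact
  - task: PublishPipelineArtifact@1
    inputs:
      targetPath: '$(Build.SourcesDirectory)/helloworld-output.txt'
      artifactName: 'pipeline-run-$(Build.BuildNumber)'
```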

So, with that I’m wrapping up this post, but you can always extend the pipeline with tasks to share the console app’s output via email or other channels. This way you can run your console apps without an App Service/WebJob/Azure Functions. The Azure DevOps platform will run the console app for you for free, I guess, until they start charging. 🙂

Further, you can extend the pipeline with a scheduled trigger to run it at scheduled times as well.

Did I miss anything, or do you have any feedback? Please help me fix it by dropping a line at sunny.ksharma@outlook.com. Thanks!

Happy Learning!
/Sunny Sharma
devsdaily.com

How to fix window.open(url) opening two tabs

#javascript #tip:

How to fix window.open(url) opening two tabs


window.open(‘url’) may open two tabs when triggered from a link in IE/Firefox and some other browsers; browsers may behave differently for “window” functions (Google Chrome is superior and the first to bring the latest W3C updates, in my opinion!).
Prefix the call with “javascript:” in the href to get the standard behavior.

use href="javascript:window.open('url')" instead
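For context, a minimal sketch of the double-tab scenario and two ways to fix it (the URLs are placeholders):

```html
<!-- If an anchor has both an href and an onclick calling window.open,
     some browsers follow the href AND run the handler: two tabs open. -->
<a href="https://example.com" onclick="window.open('https://example.com')">Open</a>

<!-- Fix 1: put the call in the href with the javascript: prefix -->
<a href="javascript:window.open('https://example.com')">Open</a>

<!-- Fix 2: keep the handler but cancel the default navigation -->
<a href="https://example.com" onclick="window.open(this.href); return false;">Open</a>
```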


ssh: connect to host x.x.x.x port 22: Connection timed out

Did you just change some network configuration on your Ubuntu virtual machine and now you’re not able to connect? Here’s how you can fix the ssh connection.

how to fix – ssh: connect to host x.x.x.x port 22: Connection timed out

Background

I spun up an Ubuntu (18.04.3 LTS, Bionic Beaver) virtual machine on Azure for an experiment. I was setting up an apache2 server (as a web server), and in the process I made some changes to the network configuration (I literally just allowed IPv4 & IPv6 in the firewall, that’s all). While making those changes I got a clear message on screen that the current network connection might become unstable, but it continued to work flawlessly until the session was closed. The very next day I wasn’t able to connect to the Ubuntu VM using ssh on the same IP/port, though the web server was responding fine on port 80.

What did I try first?

I had forgotten the earlier warning about network instability while making the network configuration changes, and now I had this problem. At first I jumped onto the Azure Portal and tried restarting the virtual machine; it didn’t work. I removed and re-added the allow rule for port 22 under the Networking tab; that didn’t work either.

How to fix “ssh: connect to host x.x.x.x port 22: Connection timed out”?

So, I did some digging on the internet and found that two lines of script were needed to fix the connection. It’s a firewall issue: the firewall rules get updated while changing the network configuration. Thankfully, the Azure Portal offers a console to run shell scripts from the browser itself.

You can find the “RunShellScript” option under Operations > Run Command on the virtual machine’s settings blade. I believe all the major cloud providers offer a similar facility for VMs, letting you reach a VM for recovery purposes even when ssh or RDP is broken. Please refer to the snapshot below.

RunShellScript from Azure portal
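The original two lines aren’t preserved in this copy of the post; based on the context (ufw blocking port 22 after the firewall change), the commands were most likely along these lines — treat this as a reconstruction, not the author’s exact script:

```shell
# Re-allow SSH through ufw and reload the firewall rules
sudo ufw allow ssh     # equivalently: sudo ufw allow 22/tcp
sudo ufw reload
```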

After I ran the two lines, I tried connecting to the virtual machine over ssh, and the connection was successful.

Edit: I later discovered that the commands in the browser console above can be run without the sudo prefix.

You can later verify the status of the firewall using “sudo ufw status” in the same window.

Microsoft Azure offers a wide range of tools to troubleshoot network errors; check out the documentation for more details.

This resolution may apply to other Linux distributions as well, but I haven’t verified it. You may want to read more articles on Microsoft Azure at DevsDaily.

I welcome any suggestions or feedback on this post. Happy learning!

HTTP Error 500.0 – ANCM In-Process Handler Load Failure

With ASP.NET Core 2.2 support added to Azure Web Apps, it’s a bit tricky to host multiple applications using virtual directories. If you just publish your web apps using Visual Studio to different virtual directories, no more than one will work at the time I’m writing this.

You need to make certain changes in order to get this working.

Let’s try publishing some ASP.NET Core apps to an Azure Web App under virtual directories and see what happens. I’ve set up a Web App on Azure for this purpose, with the below configuration on the Azure portal:

I’ve created an Empty ASP.NET Core Web App (v2.2). My plan is to change the text and publish the app separately to the root directory and to two different virtual directories, making three apps in total on one Azure web app.

Startup.cs

My plan is to change the default “Hello World” text to “App1 on Virtual Directory” and “App2 on Virtual Directory” respectively and publish each to a different virtual directory. I will have to map two folders as virtual directories to get them working, which can be done either before or after publishing the applications.

Downloading the Publish Profile makes publishing Azure web apps quick.

I added two virtual apps to the main app at the “/app1” and “/app2” virtual paths. So we have three apps now. Next I’ll publish the apps to the two virtual directories with the changed default text.

I published the default application to the root directory and it started, saying “Hello World”. Then I published the same app with changed text to the App1 virtual directory, but it failed to start with the following error (same response for App2 on the other virtual directory):

HTTP Error 500.0 – ANCM In-Process Handler Load Failure

Cause:
ASP.NET Core 2.2 and later support an in-process hosting model on IIS (set as the default). In this model the app is hosted directly inside an IIS application pool (the IISHttpServer web server), and requests are not proxied to an external dotnet.exe instance running the .NET Core native Kestrel web server.

The .csproj file is where you can see the hosting model.

Currently, only one in-process app can run in one app pool (one Azure app).

Solution:

Host the apps with the OutOfProcess model

In this case, all the apps are required to be hosted with the OutOfProcess model. When you publish the application, it generates the web.config file with the configured settings.

Open the .csproj file and, under Project > PropertyGroup > AspNetCoreHostingModel, change the value from “InProcess” to “OutOfProcess”. Be careful with the casing here!

change “InProcess” to “OutOfProcess” in .csproj file
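The relevant part of the project file looks like the sketch below (the target framework line is illustrative; only the AspNetCoreHostingModel element matters here):

```xml
<!-- .csproj: switch the hosting model from InProcess to OutOfProcess -->
<PropertyGroup>
  <TargetFramework>netcoreapp2.2</TargetFramework>
  <AspNetCoreHostingModel>OutOfProcess</AspNetCoreHostingModel>
</PropertyGroup>
```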

Note that all the apps have to be deployed with the “OutOfProcess” model; otherwise no application (inside the given Azure app) will run. This change has to be applied in all the projects you’re publishing under one Azure app.

My apps started working as soon as I changed the hosting model from InProcess to OutOfProcess and published them.

This enables us to host multiple applications inside a single Azure Web App.

Note that OutOfProcess reduces throughput, as it proxies requests between the Kestrel web server running in dotnet.exe and the IIS web server.

Azure DevOps

Azure DevOps is a suite of services for managing, collaborating on, building, and deploying code.

Azure DevOps Organizations are the starting point for Azure DevOps. You need a Microsoft account to sign in; if you do not have one, sign up for a free account. Organizations are free for up to 5 team members, and an Organization can hold many projects.

Sign in and create your first Organization and add a project.

As you sign in, an Organization is created by default based on your login email. If you do not like it, feel free to change the Organization name by navigating to Organization Settings -> Overview. You can either keep the existing URL with the new organization name or change it as well.

Once created, you will be navigated to your organization. The URL would be like <your-organization-name>.visualstudio.com.


Here’s the first screen inside a newly created organization:

For the demo purposes – I’ve added a “HelloWorld-Project” project inside “helloworld-demo” Organization.

Note: the screens are subject to change, so by the time you read this blog it is possible that an option has changed its place.

You can set the project visibility (public or private), the version control system (Git or TFVC), and the work item process template (Agile, CMMI, or Scrum) to manage tasks on the project board. I will go with Git version control and the Agile work item process.

You can manage teams and add members to it under Project Settings at the bottom in the left.

Azure DevOps offers a group of services facilitating everything required to manage, build, and deploy a project from start to finish. Following services are available under the Azure DevOps suite:

  • Azure Boards, for work management
  • Azure Repos, for source control and collaboration
  • Azure Pipelines, for building and releasing the project
  • Azure Test Plans, for manual testing and load testing
  • Azure Artifacts, for sharing different packages

Watch this space for detailed articles on the Azure DevOps services in my upcoming blogs.

Thanks for reading 🙂

Do you think this post can be improved? Please leave a comment.

Office 365 Export | Perfect Tool to Backup Office 365 Mailbox to PST

[Guest Post]

Many times, admins as well as individuals want to save their crucial data from the cloud to a local system. There are several reasons why users want to switch to a desktop email application, and Outlook is one of the most commonly used email programs, with a great number of benefits. Therefore, many users look for a solution to convert Office 365 mailboxes to PST format. Here we look at the Office 365 Export software, which helps back up Office 365 mailboxes in PST format without any hassle.

Overview: Office 365 Export Tool

Office 365 Export is a top-notch piece of software that allows the user to back up Office 365 mailboxes to Outlook PST. It can extract all Office 365 items (mails, contacts, tasks, calendars, etc.) in a seamless manner. With its self-explanatory user interface, one can easily export an Office 365 archive mailbox to PST file format without using eDiscovery or PowerShell.

Impressive Features of Office 365 Export Tool

    • Export Emails from Office 365 Mailbox
      The Office 365 PST Export tool is designed to move Office 365 data to another user account on the local machine. Moreover, it can convert Office 365 mailbox items (email, calendars, contacts, journals, etc.) to a PST file. Make sure the Office 365 admin account is impersonated before converting data from an Office 365 mailbox.
    • Move Archive Mailbox of MS Office 365
      With this application, users can export an Office 365 archive mailbox or In-Place archive mails in different formats (PST / EML / MSG). One can retrieve emails within a specific date range using the date filter option; users need to fill in both the “To” and “From” fields while extracting an O365 archive mailbox or In-Place archive mails.
    • Migrate O365 Data in Different Formats
      The Office 365 Exporter tool lets a user back up Office 365 mailboxes and save them in different formats, i.e., PST, MSG, and EML. It allows the user to export individual O365 emails along with attachments. However, an Outlook installation is mandatory for the tool to work.
    • Option to Transfer Only Specific Data
      The Office 365 Export tool provides an option to fetch only selected data instead of extracting the complete O365 mailbox. Users check the box next to the desired mailbox or other components in order to migrate them from the O365 mailbox to PST. This feature not only saves time but also avoids extracting irrelevant data when moving email from Office 365.
    • Apply Date-based Filter Option
      The Office 365 Mailbox Export tool also allows the user to filter Office 365 mails rather than retrieving the entire data from the Exchange Online account. The software provides an option to select only data lying within a certain date range when backing up an Office 365 mailbox to a PST file; users fill in the “To” and “From” fields to filter data quickly. This helps avoid unnecessary data while exporting Office 365 mailboxes.
    • Facilitates Pause & Resume Option
      The data extraction from Office 365 can be paused as many times as required; to continue, users click the Resume button. There is no need to restart the whole backup process from the beginning each time it is paused. This is useful when internet bandwidth is needed for other tasks. Further, no alteration is made to the email formatting.
    • Provide File Naming Conventions
      Using the Office 365 Export tool, a user can save all components exported from Exchange Online under different file naming conventions. Several naming conventions are available when backing up O365 emails to EML format, like Subject, AutoIncrement, Subject_Date, Subject_From, Date_From_Subject, Date_Subject_From, and some other combinations.
    • Create Migration Analysis Report
      The software shows a status report throughout the migration process. A final report generated at the end of the extraction includes the necessary details, i.e., user ID, calendar count, etc. In the case of archive mailbox extraction, the report contains folder count, folder path, export count, and fail count. A progress tick at the end conveys the successful completion of the backup of Office 365 mailboxes to PST format.

Office 365 Export Process

Pros

      • The tool is compatible with Windows 10 and all earlier versions.
      • Extracts data from an Office 365 mailbox in a very short time.

Cons

      • MS Outlook installation is needed.
      • Does not support Mac operating system.

Final Verdict

After weighing the above pros and cons of the product, we can conclude that Office 365 Export is a well-suited tool. It has a simple, easy-to-use, and interactive interface. It can be rated 9.8/10, with points off because it requires MS Outlook installation and configuration to work. Beyond this, there is little else to fault in the application.

How to Convert a SQL Table to C# Class

Run the following SQL query and it will print the SQL table as a formatted C# class:

———————————————————–

DECLARE @tbl sysname = 'doc360_User_Role'

DECLARE @ClassStr varchar(MAX) = 'public class ' + @tbl + '
{'

SELECT @ClassStr = @ClassStr + '
    public ' + ColumnType + NullableSign + ' ' + ColumnName + ' { get; set; }'
FROM
( SELECT REPLACE(systemColumns.name, ' ', '_') ColumnName,
    column_id ColumnId,
    CASE systemTypes.name
        WHEN 'bigint' THEN 'long'
        WHEN 'binary' THEN 'byte[]'
        WHEN 'bit' THEN 'bool'
        WHEN 'char' THEN 'string'
        WHEN 'date' THEN 'DateTime'
        WHEN 'datetime' THEN 'DateTime'
        WHEN 'datetime2' THEN 'DateTime'
        WHEN 'datetimeoffset' THEN 'DateTimeOffset'
        WHEN 'decimal' THEN 'decimal'
        WHEN 'float' THEN 'double'      -- SQL float is 8 bytes: C# double
        WHEN 'image' THEN 'byte[]'
        WHEN 'int' THEN 'int'
        WHEN 'money' THEN 'decimal'
        WHEN 'nchar' THEN 'string'
        WHEN 'ntext' THEN 'string'
        WHEN 'numeric' THEN 'decimal'
        WHEN 'nvarchar' THEN 'string'
        WHEN 'real' THEN 'float'        -- SQL real is 4 bytes: C# float
        WHEN 'smalldatetime' THEN 'DateTime'
        WHEN 'smallint' THEN 'short'
        WHEN 'smallmoney' THEN 'decimal'
        WHEN 'text' THEN 'string'
        WHEN 'time' THEN 'TimeSpan'
        WHEN 'timestamp' THEN 'byte[]'  -- timestamp/rowversion is a binary value
        WHEN 'tinyint' THEN 'byte'
        WHEN 'uniqueidentifier' THEN 'Guid'
        WHEN 'varbinary' THEN 'byte[]'
        WHEN 'varchar' THEN 'string'
        ELSE 'UNKNOWN_' + systemTypes.name
    END ColumnType,
    CASE
        WHEN systemColumns.is_nullable = 1
            AND systemTypes.name IN ('bigint', 'bit', 'date', 'datetime',
                'datetime2', 'datetimeoffset', 'decimal', 'float', 'int',
                'money', 'numeric', 'real', 'smalldatetime', 'smallint',
                'smallmoney', 'time', 'tinyint', 'uniqueidentifier')
            THEN '?'
        ELSE ''
    END NullableSign
  FROM sys.columns systemColumns
  JOIN sys.types systemTypes
    ON systemColumns.system_type_id = systemTypes.system_type_id
   AND systemColumns.user_type_id = systemTypes.user_type_id
  WHERE object_id = OBJECT_ID(@tbl) ) t
ORDER BY ColumnId

SET @ClassStr = @ClassStr + '
}'

PRINT @ClassStr

———————————————————–