About Matthijs Hoekstra

How to integrate the Microsoft Identity Platform (AAD or B2C) with custom JWT authentication for Realm Cloud in .NET

Wow, that's the longest blog post title I have ever used. Hopefully it will help you find this page if you are searching for the solution I describe here.

One of our customers came to us with a question about how to integrate our B2C product with Realm Cloud. I had looked at this product before but didn't know what was technically possible for an integration with B2C.

The request was to use B2C tokens with the custom JWT authentication Realm Cloud supports. They support several ways to authenticate users to sync the offline database back to the cloud. By default they switch on Nickname authentication, which is very easy to get started with (just pass a username and that user is logged in). For production use they recommend JWT authentication.

And here comes the issue with B2C (or AAD) integration. Realm requires you to upload a public key to their server, which helps prove the tokens are signed by you (with your private key). The customer was asking if they could send the B2C tokens to the Realm backend directly, since the users were already logging in to the apps with B2C. It's fairly easy to download the public key from the well-known OpenID endpoint; there is a list of public keys you can find there. The problem is that our private keys might rotate or be replaced with new ones. This is something Realm doesn't support: you upload one public key and that's it.

Besides that, you also have the problem that anybody can sign up for a B2C tenant and create valid JWTs signed by B2C with the same key. The JWT settings in Realm Cloud do allow you to provide a set of required attributes, and you should configure the audience with the client ID of your app to make sure only your tokens are accepted. But again, this will fail if we start rotating our keys (which might never happen).


So the solution is to create your own tokens, sign them with your own private key, upload your public key to their server, and start sending your self-signed JWTs to log in users.

These are the steps I took to make that work in .NET:

First you need to create a public/private key pair. The instructions are on Realm's site, but I ran into an issue with the .NET code I wrote, which requires P12 (PKCS#12) format files. So I had to change the commands a little bit.

These were the four steps:

1. Create an RSA private key and a certificate (which I need later).
2. Fill in the requested information for the certificate (it's not important what you fill in here).
3. Export the public key.
4. Create the certificate file (with the private key), and remember the password you entered.
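The original commands were embedded as images, so here is a hedged reconstruction of the four steps with openssl (the file names and the export password are placeholders I picked, not the originals):

```shell
# 1. create a 2048-bit RSA private key
openssl genrsa -out realm-key.pem 2048

# 2. create a self-signed certificate from that key; interactively you would
#    answer the prompts, -subj just skips them here (the values don't matter)
openssl req -new -x509 -key realm-key.pem -out realm-cert.pem -days 3650 -subj "/CN=realm-jwt-demo"

# 3. export the public key; this is the file you upload to Realm Cloud
openssl rsa -in realm-key.pem -pubout -out realm-public.pem

# 4. bundle certificate and private key into a PKCS#12 (p12) file;
#    remember the password, the .NET code needs it to open the file
openssl pkcs12 -export -in realm-cert.pem -inkey realm-key.pem -out realm-cert.p12 -passout pass:changeit
```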

The following code reads the PFX (I renamed it to a .p12 file), creates a new JWT, and signs it. This JWT is used to sign in the user.

Read the cert (I am not sure if the last parameter has to be set like this 🙂):

Set up the infrastructure to be able to sign the JWT:

Create the payload for the JWT. This one is really simple: it only needs the userId. I have it hardcoded here, but this is where you can integrate B2C or AAD by using the oid or sub claim from the ID token you got back from the B2C or AAD endpoint. Don't use the email address, since that might change (and the same goes for the name). The oid is unique per person, even across different applications, so it's the best claim to use for this scenario.

And create the access token:
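The C# listing itself was an image in the original post, so as a language-neutral sketch of what that code produces, here is the same kind of RS256 JWT assembled with openssl (the throwaway key, the hardcoded userId GUID, and the one-hour lifetime are placeholder assumptions):

```shell
# base64url encoding without padding, as JWTs require
b64url() { openssl base64 -A | tr '+/' '-_' | tr -d '='; }

# throwaway key so the snippet is self-contained; in practice use the key
# whose public half you uploaded to Realm Cloud
openssl genrsa -out realm-key.pem 2048

header=$(printf '{"alg":"RS256","typ":"JWT"}' | b64url)
# the payload only needs the userId; in the B2C/AAD case this would be the
# oid (or sub) claim from the ID token
payload=$(printf '{"userId":"00000000-0000-0000-0000-000000000000","exp":%d}' "$(( $(date +%s) + 3600 ))" | b64url)
signature=$(printf '%s.%s' "$header" "$payload" | openssl dgst -sha256 -sign realm-key.pem | b64url)

jwt="$header.$payload.$signature"
echo "$jwt"
```

This is handy for checking that whatever your .NET code emits has the same shape: three base64url segments joined by dots.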

Now you can do your Realm stuff. Typically you would check if you already have a logged-in user (the SDK does that for you), but if the current user is null you log the user in with the following lines. (By the way, the Realm documentation has a bug where passing null as the last parameter results in an exception, so I changed the highlighted part.)

That's it. Your user will show up in Realm Studio as a JWT user.


The BIG and IMPORTANT caveat here is that you need to figure out how to protect your private key. That's something you don't want others to get access to, since with it they can start generating their own tokens.

Perhaps using B2C Identity Experience Framework policies to do a REST call to your server, which signs a custom JWT that you pass back to your client as a claim and use to sign in, might be a nice workaround for this. I leave that exercise to the reader; please let me know if it works 🙂

Mission accomplished.

Create a user delegated permission and an application permission with the same name in Azure Active Directory

For a training we are delivering, I tried to create a little sample showing how to create an API and protect it with our Microsoft Identity Platform. We support two kinds of permissions in our consent and permissions framework: user delegated permissions and application permissions. This is what we use for MS Graph as well.

User delegated permissions are used if you want to grant the app permissions on behalf of the user. For example, I want the app to be able to read the user's email. In our portal that's very easy to set up: go to the application blade and select Expose an API.

The second type of permissions are called application permissions. These are used, for example, by daemon apps: applications that don't have direct user interaction. At this moment you cannot create these through the UI, so you have to modify the manifest.

What I didn't know until this week is how to create an app permission with the same name as a user delegated permission. For example, Catalog.View.All is a permission I want to expose so a daemon app can call that API as well. Application permissions are created by adding roles to the manifest. It's almost the same as user roles, with the little change that allowedMemberTypes is Application instead of User.

The one trick, if you need the app permission to be the same as the user delegated permission, is that the id, display name, and description have to be exactly the same. So if you look at the user delegated permission in the manifest:

The adminConsentDisplayName needs to be the same as the appRole's displayName, and the adminConsentDescription needs to be the same as its description. The id needs to be the same, and isEnabled needs to be the same too.
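To make the mapping concrete, here is a hedged sketch of what the two manifest entries could look like side by side (the GUID, names, and descriptions are made-up examples, and fields like userConsentDisplayName are left out for brevity):

```json
"oauth2Permissions": [
  {
    "adminConsentDescription": "Allows the app to view the catalog",
    "adminConsentDisplayName": "Catalog.View.All",
    "id": "11111111-2222-3333-4444-555555555555",
    "isEnabled": true,
    "type": "User",
    "value": "Catalog.View.All"
  }
],
"appRoles": [
  {
    "allowedMemberTypes": [ "Application" ],
    "description": "Allows the app to view the catalog",
    "displayName": "Catalog.View.All",
    "id": "11111111-2222-3333-4444-555555555555",
    "isEnabled": true,
    "value": "Catalog.View.All"
  }
]
```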

If these values don't line up exactly, you will get an error when trying to save the manifest.

If I now go to my daemon app and request API permissions, the permission shows up under both types. First the Application permissions screen:

And second the Delegated permissions screen:

We really should have a UI for this, so you don't have to do it by hand and make the mistakes I made to get this to work. The team is working on it; I just don't know what the exact timeline is.

Let me know if this was useful for you and if you are using this to protect your own APIs with our identity platform.

Little update about my job after 8 months

At the end of June our fiscal year ended. After a lot of travel this month I finally had some time to spend with my family. My mom is visiting and was able to watch my daughter Lisa, so my wife could join me in Washington, where I was for Identiverse, and later travel to visit friends near New York. June was the heaviest travel month for me so far: I spent two nights at home. But this weekend I spent time away from home WITH family and enjoyed a nice time on the water in Bremerton. That also gave me some time to reflect and look back at my new job so far.


To summarize the job I started in October 2018: tons of travel! Before I joined this team I had a year in which I didn't travel at all, and since I started this new role I have been around the world. I have seen many different places and met a ton of new people. I learned a ton of new technology and visited many conferences. Time really has flown by since I started.

The video above was built with the mobile app 'App in the Air'. It reads all my TripIt information (the app I use to organize my travel) and creates a nice little video. As you can see, I have sat in a plane a lot.

Since I started the job I have flown 128,257 real miles and sat in a plane for 286 hours across 54 flights. If you look at the TripIt stats, I have traveled 108 days over 12 trips and visited 16 countries and 33 cities. This resulted in my being Delta Diamond for the first time in my life (125,000 qualifying miles needed; I have 141,504 so far this year alone). It also got me to Platinum level at Marriott, and I spent plenty of nights in other hotel brands as well.

To keep my daughter involved during all the time away from home, we bought a world map and we set pins on the places I still need to go and where I am at the moment (the golden pin). I also send postcards from all the places I travel to (a tip from Colene). So far Lisa has received 15 postcards (the Milan and Johannesburg cards never arrived). Some cards take 5 weeks to arrive, while others take a week.


I started most of the work for Ignite The Tour, where we had to present on our Identity platform and I had to man the Azure Active Directory booth. One thing I learned: booth duty is an enormously good way to ramp up. I would recommend any new hire man the booth for a couple of days. You might not know any answers when you start, but that forces you to figure them out, and it's great for your internal network. It also forced me to understand more than just the developer platform.


As my job description says, I presented at a lot of different conferences across the world. Part of the job is trying to get in the door at non-Microsoft conferences. You need to build a bit of a name for yourself before you get selected and invited. Fortunately I still know some people who were generous enough to offer a speaking slot at their conferences. I also delivered a ton of different developer trainings around the world. I was fortunate to start this job with the help of my colleague Kyle Marsh, so it was an easier start because I was able to ask a ton of questions. Besides Kyle there are a ton of other folks I have gotten to know who can help me with my endless list of questions. We are still figuring things out together. The interesting part of giving developer training is that you really need to understand and know how things work. I still run into things which I think are not logical or are hard to explain to developers. Most developers we train are not familiar with modern authentication and authorization. Terms like OAuth2 and OIDC are completely new to them. We try to explain the new way to integrate with Azure Active Directory such that they don't really need to understand how those protocols work.

A few conferences stood out to me:


Identiverse

This conference was held in June in Washington, DC. Everybody who is anybody in the identity space is at this conference; it felt like a small family. Interesting content, but more so, very interesting people. You realize these folks are the people who invented a lot of the things that make the internet as we know it more secure. It was also very clear Microsoft is one of the leaders in this space. My colleague Libby demoed our FIDO2 integration with our platform, and that got a huge applause from the audience (and the folks in that audience really understand the importance of this).

Techorama Belgium

I finally got the chance to present at and attend Techorama in Belgium (1,700 attendees). Together with my colleague Kyle Marsh I delivered a paid one-day pre-conference developer workshop, and I presented a session at the conference itself. It was very well organized, and it was great to see a lot of familiar faces and catch up. Fortunately I am presenting at Techorama NL in October as well.

NDC Oslo

This was one of the best organized events I have ever attended. Especially the catering was smart: food available the entire day, so no huge lines during the lunch rush. Also a ton of familiar faces and tons of very well known speakers. I hope I can get on stage at this conference in the future. I attended a workshop from Brock Allen on ASP.NET middleware and IdentityServer. It was one of the better trainings I have ever attended and gave me a ton of knowledge about our own platform as well. I returned home with a lot of questions on how and why we implemented certain features in Azure Active Directory 🙂


What I like most about this job is meeting new and familiar people. I love working with (enterprise) developers, and being on the road again helps me meet so many of you. I learn a ton. My job is not only being the developer voice of our Identity organization; it's also bringing back feedback and insights. Every time I talk to a developer I learn something new (or get confirmation of something we already knew).

Coming time

Part of the job is also a ton of customer/ISV meetings and calls to talk through different architectural discussions: how do you do X, how do I add external identities, what's the best way to develop multi-tenant solutions, etc. We also support our internal teams at Microsoft. It's still cool when you have a call with some developers from Minecraft and you are able to come up with the architecture they need to implement a certain requirement.

In the coming period I am focusing on creating more developer training content. We are scaling up our efforts to also train more field people (MS colleagues who talk about security with our customers) on our developer content. I plan to submit to more conferences to try to get speaking slots. We will create more developer content in a box, which can be used by the field and MVPs to redeliver the training we have been delivering all over the world. Although the content is still changing, we think we are currently in a fairly good spot.

I also want to create a few blog posts with little nuggets of information and things I learned. I had hoped to do that more during my learning process, but to be honest, I have been so busy ramping up and delivering content all over the world that I didn't find the time.

One thing I didn't expect with all the travel is how tired I would be. When traveling you think: I have so much time on the plane, and when I am at the location I have so much time at night since I am not at home. But most of the time I am just tired, jet-lagged, and hungry. There are tons of preparations to do for the trainings and presentations, and the work from Redmond, with all the meetings and customer calls, continues while you are traveling too. So you make tons of hours on just a few hours of sleep a night before heading back home, where you try to have a social and family life and perhaps spend some time continuing the remodel, which is still not finished 🙂

We signed up for 20 cities for Ignite The Tour this year (Tokyo and Singapore are new cities for me). We divided them between the two of us, so hopefully we can hire new people to join us for this tour to lessen the travel burden a bit. On the other hand, this gives me the opportunity to travel to Australia again, for example, and visit my buddy Roel. There are absolutely benefits to travelling the world.

So far it has been a great experience, and I have learned a ton. Sandra and Lisa have been great supporters. Fortunately we can hire two more people for the team, which should help cut back some of the travel, which has been a bit crazy.

Configure domain_hint in ASP.NET Core

This took me way too much time to figure out, since there is a ton of old information on the internet. I wanted to change the default behavior when people log in to my ASP.NET Core website using Azure Active Directory (or the Microsoft Identity Platform). After some searching I figured out how to change this setting.

You have to add the following piece of code to the ConfigureServices method in your Startup.cs:
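The snippet itself was an image in the original post; this is my best reconstruction for ASP.NET Core 2.x with the AzureAD.UI package (the scheme name and the contoso.com domain are assumptions; adjust them to your setup):

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Authentication.AzureAD.UI;
using Microsoft.AspNetCore.Authentication.OpenIdConnect;

// inside ConfigureServices, after services.AddAuthentication(...).AddAzureAD(...):
services.Configure<OpenIdConnectOptions>(AzureADDefaults.OpenIdScheme, options =>
{
    options.Events = new OpenIdConnectEvents
    {
        OnRedirectToIdentityProvider = context =>
        {
            // adds domain_hint=contoso.com to the authorize request
            context.ProtocolMessage.DomainHint = "contoso.com";
            return Task.CompletedTask;
        }
    };
});
```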

The same trick works for login_hint.

Hope this saves me some time the next time I am looking for this information.

Switching to Google Fi

Last week I switched all of my family's mobile lines to Google Fi. We had T-Mobile for some time, but I wanted to try out how Google Fi works.

Since I am going to travel a bit for work, I was looking for a new phone that could last at least a working day without charging and gives me great coverage. I also wanted a plan that works great abroad. T-Mobile already has excellent worldwide coverage with free text and data, but the speed is limited; for $5 a day you can get regular speeds (1GB for the day). Google Fi has international data included in its plan, so that sounded interesting. In the past when I checked them out, it only worked with a few select phones. Recently they added the possibility to use any Android phone, and even iPhones work today. All it takes is installing the Google Fi app and you're good to go (with your own phone you can test it free for a month, and you can always port your number later if you want to).

What's unique about Google Fi is that it uses three operators in the US (Sprint, T-Mobile, U.S. Cellular) and picks the strongest one, or one of 2 million+ Wi-Fi hotspots, and switches for you automatically to give you the best connection. You can check out the coverage map here. It can also protect your connection by automatically using a VPN (yes, you have to trust Google, but you already do since you use Android 🙂).

So I ordered a Pixel 2 XL, since that gave me a $300 credit on Google Fi. This made it a bit cheaper than the newer Pixel 3 XL. (I love the phone; battery life is excellent and so is the speed. Can't wait to test it internationally.)

The Pixel 2 XL has an eSIM, which means you don't have to put in a separate SIM card (you can if you want). You download the Google Fi app and activate your line through the app. Porting the number was done in less than two minutes (you need your number and the PIN you set up with T-Mobile).

I signed up my wife and daughter too. A few days later the SIM cards arrived in the mail; I popped them into their phones, started the Google Fi app, and transferred their numbers. All set and good to go.

On the website or in the app you can see more details about your usage. Everybody can see their own usage in depth, including which app uses how much (that's something I, as the plan owner, cannot see; I can only see the total usage per person).


Adding a data SIM was easy too. You order one for free on the website. You navigate to fi.google.com/data and enter the code on the card which holds the SIM; the Gmail account you are logged in with determines which person the data SIM is attached to. Activating it was easy. I had to add an APN to my extra phone manually (h2g2), but after that it just worked.

So all in all I am quite happy. It's a simple model with voice and text, and the data bundle is easy too. It doesn't matter if you are in the US or abroad, or whether you are tethering or not. It's also easy to get data SIMs for devices which need to be online (I can't wait for all my laptops to have a SIM slot).

So how does T-Mobile compare to Google Fi?

My T-Mobile plan was $126 per month total for three lines (I somehow got a free third line in the past). The interesting things included:

  • Unlimited talk, text and data (2GB-22GB) hotspot amount is limited
  • Streaming like Netflix doesn’t eat away from your data bundle. But by default it’s optimized for DVD-quality (480p)
  • In-flight texting on all Gogo-enabled flights

There is a bunch of extra things you can add, like voicemail transcriptions, but you have to pay extra for those. Since I started my service with T-Mobile they have introduced new plans where you pay only $100 including taxes and get a free Netflix subscription as well. So not bad at all.

But when I looked at my family's usage I saw we only use around 3GB per month in total. So let's look at what Google Fi charges.

The first line is $20 per month (plus taxes and fees, so add another $5). This gives you unlimited talk and text. You have to pay for data! BUT only for the first 6GB: it's $10 per GB, so you never pay more than $60 for data; after that it's free. They call it bill protection.

Every extra line costs $15 per month, so for three lines I pay $50 plus taxes and fees. The bill protection with three lines kicks in at 12GB(!), so maxed out this would cost me more than T-Mobile, but my family average is only 3GB.
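As a quick sanity check of the pricing above (pre-tax, using the post's numbers; the 12 GB threshold is the stated bill-protection point for three lines):

```shell
lines=3
gb_used=3

base=$(( 20 + 15 * (lines - 1) ))             # first line $20, each extra line $15
billed_gb=$(( gb_used < 12 ? gb_used : 12 ))  # bill protection: data billing stops at 12 GB for 3 lines
total=$(( base + 10 * billed_gb ))            # data costs $10 per GB

echo "$total"  # 80, i.e. $80 for a typical 3 GB month vs $126 on T-Mobile
```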

T-Mobile charges $20 for an extra line for tablets and $10 for smartwatches. Google Fi doesn't charge anything for a data SIM; you can order as many as you want, and they just eat into the same data from your plan. This I like a lot.

What I also like about Google Fi is that you can use tethering on your phone as well; again, it's the same data from your plan. The same goes for anything you do internationally. It's all just one data bundle.

So give it a try. You can use my link https://g.co/fi/r/56XFYR which gives you $20 credit (and I get some too 🙂).

What to pack for business travel?

For my new job I need to travel a lot again. So instead of giving tips on how to fold your underwear so you can travel three weeks with only a carry-on, I will share some of the stuff I take with me when I travel.


Since I will be delivering presentations, demos, and trainings, I travel with at least two laptops: in case one stops working, but also to have one ready to download whatever might be needed to recover the other device if it ends up with a corrupt OS or something like that.

For this trip to Sydney and Berlin I will pack two Windows machines. I might bring a Mac as the second machine next time instead, but for this trip that won't be needed. So I'll bring my Surface Laptop (all-time favorite) and, as a backup, the Surface Book (1), plus two power adapters so I can charge them both at the same time.


I have two external drives with presentations, demos, and other stuff I need to help prep myself.


It's the Samsung T5 250GB SSD, since they are super fast USB-C SSD drives. I have had these for some time; the bigger ones are very affordable too. Very useful if you need to copy virtual machines, ISO files, etc.

They also hold a copy of Win10 and offline installs of Visual Studio 2017 and VS Code. The offline version of VS2017 is important, since a regular install will download tons of stuff (like the Android emulators) from the internet, and that's no fun if you are stuck with crappy hotel Wi-Fi.

I've set up all my accounts with two-factor auth. If you happen to have no cellular reception or Wi-Fi access on your phone, that might be an issue, so I also set the accounts up to accept the codes the MS Authenticator app generates. An added benefit: you can log in to your sites (like the Azure portal) from your laptop on the plane, where you don't have phone reception. I also bring my YubiKey to be able to access my accounts. I bought a very cheap FIDO2-compliant one, also to be able to demonstrate some of our AAD integration in the future.


To be able to hook up my laptop on stage to a cable that provides internet (always try to get a wired connection; never trust the Wi-Fi at conferences with tons of people in the room using that precious bandwidth you so desperately need while presenting), I use a USB 3 hub with a 1Gbit ethernet port. It also comes in handy if you want to plug in the USB receiver for your mouse, your YubiKey, and a clicker, for example. I use this one and it works great (not for your Mac though!).

(I have a Satechi, but this seems to be the exact same one.)

Whenever you travel and have to present, at a conference or a customer, you never know if your laptop will successfully connect to whatever AV equipment is set up. Always be on time and try out what works and what doesn't. It has happened dozens of times that I could not connect on the first try, or only at a very weird resolution. What has helped me is this little adapter.


Even when a Mini DisplayPort connection was available, it has happened that I still had to use HDMI to get the correct connection and, for example, audio to work. This thing works great on both Surface devices I am bringing. For the Mac I will carry a USB-C version with VGA, Ethernet, HDMI, and USB. Yes, VGA is still used by a lot of our enterprise customers.

The Logitech presenter has been in my bag for years. It's useful to have a remote clicker, and on top of that, it has a laser!


When traveling it's always useful to have a battery pack for your mobile. Even cooler and more useful is a battery pack that is also a wireless router or bridge. This is the TripMate Titan; I have the 10400mAh version.


Besides being able to charge your phone, it can also work as a wireless router. Plug in a network cable and you can connect your devices wirelessly. Useful in your hotel, but also on stage when you don't have good coverage. It works without being powered, but you might want to hook up a USB cable just in case. The device is also capable of creating a wireless connection (to your hotel network) while still acting as a wireless hotspot for your own devices, so they can share the same wireless connection.

In the past I always threw a US power strip in my suitcase and connected it to the power outlet with a travel adapter. The Mogics Power Bagel is something I haven't used before and am bringing for the first time.


It's very small, has its own travel adapter, and you can connect four plugs and two USB devices at the same time. Since it's round, you won't have a problem plugging in the larger adapters. It also has a little pull-out extension cord for when you need it.

It's always useful to have a spare ethernet cable handy, for hooking up your laptop in the hotel room or connecting the wireless router to the wall. I bought a set of Cable Matters retractable ethernet cables, since they roll up so nicely.


I also always bring a mouse. It's just easier for me than a trackpad. The Microsoft Arc mouse is a favorite, also because it's flat when you pack it.


If you are planning to rent a car, bring a car USB charger to be able to charge your phone, especially when you are using Waze for navigation.

Of course, USB cables to charge my phone.

Lastly, I have a set of noise-cancelling headphones. I use the Bose QC35 II (no Surface Headphones yet). They're also great for Teams calls when you are on the road, since they have a microphone as well. Priceless when you sit in a plane for 20 hours. I also have a pair of in-ear ones which I can use when I want to sleep.


The last thing I pack is my Kindle Paperwhite. Without it, it's really hard to get through all those hours on the plane and nights in the hotel.


So what are your most important travel gadgets? Let me know in the comments.

How to detect if your devices are trying to circumvent your pihole

As I described in my previous blog post, you can set up a Pi-hole DNS server to optimize your network traffic and your browsing experience. But it seems not every device respects the DNS settings you hand out via DHCP. Some devices have hardcoded DNS entries and just ignore your settings. Scott Helme wrote on his blog how to catch those naughty devices and redirect their traffic to your Pi-hole instead.

But before doing that, I was curious how many of those devices I actually had on my network. To figure this out I had to set up my USG firewall to catch the TCP/UDP requests on port 53 which are not originating from my Pi-hole's IP address. The USG firewall can be configured to log certain events (without blocking the actions). These will show up in the log file on your USG, which can be found at /var/log/messages. You can view this file with the command:

tail -f /var/log/messages

Depending on your firewall configuration you will see almost nothing, or a ton of information going by. The goal is to capture these kinds of events:

Oct 21 17:53:42 USG kernel: [WAN_OUT-2000-A]IN=eth1 OUT=eth0 MAC=80:2a:a8:f0:0a:49:94:9a:a9:23:23:40:08:00 SRC= DST= LEN=58 TOS=0x00 PREC=0x00 TTL=127 ID=59302 PROTO=UDP SPT=58633 DPT=53 LEN=38

What you see here is a request from a client (the SRC address) doing a DNS request to the DNS server at the DST address. DPT=53 means destination port 53, which is the port a DNS server listens on.

A legitimate event would look like this:

Oct 21 17:55:05 USG kernel: [WAN_OUT-2000-A]IN=eth1 OUT=eth0 MAC=80:2a:a8:f0:0a:49:b4:fb:e4:8c:32:67:08:00 SRC= DST= LEN=57 TOS=0x00 PREC=0x00 TTL=63 ID=20414 DF PROTO=UDP SPT=23724 DPT=53 LEN=37

This is a DNS request coming from my Pi-hole server, which is configured to forward DNS requests to an upstream DNS server.

Let's set up the firewall to start generating these events in your log file. I have done this with UniFi version 5.9.29. Go to your Cloud Key settings page, click Routing & Firewall, click Firewall at the top of your screen, click WAN OUT, and click Create New Rule. This is what my screen looks like:

At the bottom you have to create a new port group for the destination. Click the Create Port Group button and create one for DNS like I did below:

Make sure you click the Add button after you fill in the port number (DNS listens on port 53) before you hit Save. Click Save again; this will cause your USG to be re-provisioned. Then SSH into your USG.

To see all DNS requests in your USG log file, you can use the following command:

tail -f /var/log/messages | grep -F "DPT=53 "

This will show any DNS requests going out to the internet, including the ones from your Pi-hole. To only see the naughty devices you can use the following command (another grep; perhaps there is a more efficient way, but this worked for me :)), where the IP address is the IP address of your Pi-hole:

tail -f /var/log/messages | grep -F "DPT=53 " | grep -v "SRC=<your-pihole-ip>"

This one takes a while before it starts showing output, but it worked for me. Now you will only see the DNS requests from your naughty devices coming through your USG. So how do you test this? nslookup lets you pass the DNS server to send the request to as a second argument, which is a great way to test your setup, for example against Google's public DNS:

nslookup techmeme.com 8.8.8.8

So far I have only seen a Samsung Galaxy S7 going to a Google DNS server directly. So the devices on my network seem to be well behaved.
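If you'd rather get a tally of offenders than watch the log scroll by, you can aggregate the captured lines per source IP. A sketch (the 192.168.1.2 Pi-hole address is a placeholder for yours):

```shell
# count captured port-53 requests per source IP, ignoring the pi-hole itself
# (replace 192.168.1.2 with your pi-hole's IP)
tally_dns() {
  grep -F 'DPT=53 ' "$1" \
    | grep -v 'SRC=192.168.1.2 ' \
    | grep -oE 'SRC=[0-9.]+' \
    | sort | uniq -c | sort -rn
}

# on the USG you would run: tally_dns /var/log/messages
```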




Installing Pi-hole on your Cloud Key Gen2 Plus

The other day I bought myself a Gen2 Cloud Key Plus from Ubiquiti to replace my old Cloud Key. It comes with the UniFi SDN controller and the new UniFi Protect installed. The device looks really nice and has a little display which shows you information about the applications running on it.


Since I have been playing with Pi-hole lately on one of my Raspberry Pis, I was wondering if I could install Pi-hole on the Cloud Key, so I would have everything for my network in a central place. With the help of Google I managed to get it working by following the steps below.

First you have to install a DNS server on the Cloud Key, since that's used by the Pi-hole software. SSH into your Cloud Key and enter the following commands:

sudo -i

apt-get update

apt-get install dnsmasq

Then we can install the Pi-hole software. I chose to download the install script and execute it on the device:

cd /tmp

wget -O basic-install.sh https://install.pi-hole.net
bash basic-install.sh

Keep all the defaults. The only thing I had to do was answer no to keeping the IP address from DHCP, since it didn't copy the IP address; I entered it myself. During the install the lighttpd webserver is installed too; this is used by the admin page.

The last thing is to change the default port of the admin website, since port 80 is already taken by the Cloud Key management interface.

Make a backup of the config:

cp /etc/lighttpd/lighttpd.conf /etc/lighttpd/lighttpd.conf.backup

sed -i -e 's/= 80/= 81/g' /etc/lighttpd/lighttpd.conf

Or use vi/nano to edit the config file and change the server port.
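For reference, the only line you're changing in /etc/lighttpd/lighttpd.conf is the server port (81 is just the port I used; any free port works):

```
server.port = 81
```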

Restart the webserver:

/etc/init.d/lighttpd restart


http://<IP>:81/admin should bring up the pi-hole interface


Every time you re-run the Pi-hole installer, you have to change the webserver port away from port 80 again.


Let me know if this works for you or if I forgot to document a step.

New job in the Azure Identity team

I just posted the email to my colleagues and sent an email to our wonderful Windows development MVPs. Today is my last day in Windows (DEP, the developer platform team). I am starting a new job in the Azure Identity organization, in the CxP team. I will be working with developers to evangelize and drive adoption of our Azure Active Directory platform. The full job description is below:


Senior Program Manager

Azure Active Directory Premium, B2C

The Digital Transformation era is upon us! Applications and data are moving to the cloud; employees want to be productive on devices they love from locations of their choice; organizations want to give seamless access to employees and partners; self-service is in and helpdesks are past. In the middle of all these exciting changes, security breaches are getting more sophisticated by the day. The single common factor in this journey that our customers are undertaking is … Identity.

The @Scale CXP team in the Identity engineering division within Cloud+AI works with partners, developers and customers from all over the world to drive service adoption and we work directly with engineering to shape the product. The best of both worlds!

As Microsoft cloud services adoption continues its rapid growth, Developers play a critical role in helping to drive usage of our services. Developers are at the center of enabling key customer scenarios, building solutions ranging from enterprise scale applications and services to niche departmental business process apps. Assuring Developers have the technical skills and Identity developer platform necessary to build and sustain a vibrant Identity business is extremely important to our shared success. Assuring our developers' needs are evangelized throughout our engineering organization as part of the engineering lifecycle is critical to our long-term business growth and sustainability.


In this role you will help drive usage and adoption of the Identity dev platform by supporting awareness and growth of product expertise within the developer ecosystem, and define, build, and execute on engagements with developers to get feedback, evangelize their product needs, and drive enhancements through the engineering lifecycle. This work is instrumental for our business to learn from developers across the globe as we understand how our technology is adopted. Our world evolves at the speed of cloud and we are looking for active learners who can collaborate across a diverse team and global business.

Key Responsibilities:

Evangelize the Identity developer platform and drive its adoption

Drive usage: More active third-party apps built on the Microsoft Identity developer platform getting used more broadly across a larger customer base.

Drive engagement model with B2C developers to grow the inventory of apps in our marketplace, remove technical roadblocks and discuss product roadmaps. Connect with developers at major Microsoft or Industry events and road shows.

Own Technical Enablement and Readiness: Drive Identity dev platform awareness through calls, webinars, office hours, Yammer, training sessions, etc.

Define performance measures to provide our Identity leadership with crisp, actionable insights.

Channel Developer feedback to the feature teams to help with prioritization.

Track and improve Developer satisfaction with our platform.

Partner with other Microsoft teams to align with their developer ecosystem strategy.

Regularly report out on impact and opportunities.


Basic Qualifications:

Minimum seven years of work experience in the computer software industry including two years of technical experience in security, cloud, and/or identity solutions.

Bachelor’s Degree in computer science or related discipline, or equivalent experience.

Preferred Qualifications:

Ability to Ramp to L400+ on Identity Platform Technology

Direct experience working with developers is highly desired

Collaboration and cross-teaming skills.

Comfortable working autonomously in a fast-paced environment where new challenges exist around every corner.

Strong prioritization, time management, and organizational skills.

Ability to take on complex systems and processes and drive simplification and improvements.

Self-starter, who can deal with ambiguity, maintains focus, drives to clarity and provides innovative solutions.


I had an amazing time in Windows, and spent the last year working for one of the best managers I have had in my career (thank you Lora!). I am going to miss working with the fantastic Windows Developer community and I hope our paths cross again. I will take some time off before I start the new role. There are lots of new things to learn, and I can finally talk and blog about my work again, so I expect to take you along, on my blog, during the Azure Identity journey I am about to make.

Adding FlightRadar24 feed to my FlightAware raspberry pi PiAware install

For a week or so now I have been running PiAware from FlightAware on one of my Raspberry Pis, and it's running fine. Thanks to Chris Johnson I also managed to feed Flightradar24 from the same receiver. These are the steps I took on my Raspberry through the shell. I don't run a fancy container solution like Chris does on his setup, so I had to borrow some configuration and instructions from his GitHub page.

These were the instructions I pasted into my sudo shell window:

To configure the feed, type:

Enter your email address, leave the next prompt blank, enter your latitude, your longitude, and your altitude in feet, then enter 'yes' to confirm, and the ini file will be filled in for you.
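For reference, the prompts write their answers into the feeder's ini file (on a standard install, /etc/fr24feed.ini). The exact keys and values vary by fr24feed version and receiver setup, so treat the following as a rough sketch rather than a template — the sharing key shown is a placeholder that gets generated for you during signup:

```ini
receiver="avr-tcp"
fr24key="0123456789abcdef"
host="127.0.0.1:30002"
bs="no"
raw="no"
logmode="1"
mlat="yes"
mlat-without-gps="yes"
```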


and you are set. You can check the /var/log/fr24feed.log file to see if everything is working correctly.