As an introduction: I've already written several articles here about Microsoft Cognitive Services. By combining the power of machine learning and Xamarin, you can make your apps smarter and more personal, as I've shown before. All of this led up to a presentation given at dotNedSaturday, organised by dotNed, the largest .NET user group in the Netherlands.
This presentation dives into the power of Microsoft Cognitive Services in combination with Xamarin to build apps for your phone. By doing this, apps can be made a lot smarter, since you'll be able to tap into machine learning with ease. Apps can become a lot more personal as well, as if each one were an assistant that goes beyond Siri, Cortana, or Google Now. All of that in a fully cross-platform mobile fashion using Xamarin.
During this session I dive into the different options available in Microsoft Cognitive Services and give a hands-on demo of how to integrate them into a mobile app. The sample code can be found on GitHub. Note that the presentation itself is in Dutch; the slides are in English. Enjoy!
Tags: cognitive xamarin microsoft presentation
One of the things Android has had for a long time is App Widgets. With these small apps, you can offer byte-sized functionality that users can access from their home screen and lock screen. Support for lock screen widgets was removed in Android 5.0, while iOS 10 reintroduced the concept.
In this article we'll dive into how you can create your own App Widget using Xamarin.Android. We'll cover a Hello World example that you can use as a starting point for your next project.
For this demo, I wanted to create a simple app widget that displays an icon, some text, and the current time. When you tap the widget, the time refreshes; when you tap the icon, an app launches. Let's see how we can create such an app widget! The source code can be found on GitHub.
Tags: widget xamarin android
As an introduction: I've already written several articles here about ATDD/BDD with SpecFlow and Xamarin. All of this led up to a presentation given at Microsoft TechDays in the Netherlands. Here, I show how InfoSupport works together with the Dutch Railways to turn specifications into high-quality apps.
During the presentation we discuss the Three Amigos, how specifications are written using Gherkin, and how they are automated with SpecFlow and Xamarin.UITest. We even managed to introduce a new buzzword: Acceptance Driven Presentation (ADP).
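To make that workflow concrete, here's a minimal, hypothetical Gherkin scenario (not taken from the talk itself; the feature, station names, and steps are illustrative) of the kind SpecFlow can bind to automation code:

```gherkin
Feature: Ticket purchase
  As a traveller, I want to buy a ticket in the app

  Scenario: Buying a single ticket
    Given I am logged in as a traveller
    When I buy a single ticket from "Utrecht" to "Amsterdam"
    Then my ticket overview shows 1 ticket
```

Each Given/When/Then line maps to an attributed C# step method, which in turn drives the app's UI through Xamarin.UITest.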
The sample code can be found on GitHub. Note that the presentation itself is in Dutch; the slides are in English. Enjoy!
Tags: atdd bdd specflow xamarin presentation
Like I said in my previous article, the power of Microsoft Cognitive Services is amazing: with minimal effort, you have machine learning at your fingertips. I wanted to dive even deeper into Cognitive Services and was intrigued by the Entity Linking Intelligence Service. I already knew the Computer Vision API could do some OCR, and I wanted to combine the two to try out an idea I'd had for years.
When you visit a museum and look at a painting, nine times out of ten there's a sign next to it with some information. While many art experts will know exactly what painting technique was used, who the artist is, why it was painted, and so on, I need more information than what's on that sign. This is where Entity Linking comes in handy.
Once again we'll be using Xamarin.Forms with a focus on iOS (Android and/or UWP will work the same way). The source code can be found on GitHub.
Let's see if the combination of these Cognitive Services can help us achieve this goal!
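The article itself uses C#, but both services are plain REST APIs, so the chain is easy to sketch in any language. Below is a minimal, hypothetical Python sketch of the two requests being combined: OCR a photo of the sign, flatten the result into text, then feed that text to Entity Linking. The endpoints, region, and parameters are assumptions based on the v1.0 APIs as they existed at the time (the Entity Linking service has since been retired), so treat this as the shape of the calls, not a drop-in implementation.

```python
import json
from urllib import parse, request

VISION_KEY = "<your-computer-vision-key>"   # placeholder, not a real key
ENTITY_KEY = "<your-entity-linking-key>"    # placeholder, not a real key

def build_ocr_request(image_bytes):
    """Prepare (but don't send) the Computer Vision OCR call for a photo of the sign."""
    query = parse.urlencode({"language": "unk", "detectOrientation": "true"})
    return request.Request(
        "https://westus.api.cognitive.microsoft.com/vision/v1.0/ocr?" + query,
        data=image_bytes,
        headers={
            "Ocp-Apim-Subscription-Key": VISION_KEY,
            "Content-Type": "application/octet-stream",
        },
        method="POST",
    )

def extract_text(ocr_json):
    """Flatten the OCR response (regions -> lines -> words) into one string."""
    doc = json.loads(ocr_json)
    words = [
        word["text"]
        for region in doc.get("regions", [])
        for line in region.get("lines", [])
        for word in line.get("words", [])
    ]
    return " ".join(words)

def build_entity_linking_request(sign_text):
    """Prepare the Entity Linking call that looks up entities in the sign's text."""
    return request.Request(
        "https://api.projectoxford.ai/entitylinking/v1.0/link",
        data=sign_text.encode("utf-8"),
        headers={
            "Ocp-Apim-Subscription-Key": ENTITY_KEY,
            "Content-Type": "text/plain",
        },
        method="POST",
    )
```

In the app, each prepared request would be sent, the OCR text fed into the second call, and the linked entities (e.g. a Wikipedia entry for the artist) shown to the user.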
Tags: ocr entity linking computer vision cognitive services xamarin microsoft
Ever since Microsoft Cognitive Services became available, I've wanted to give those APIs a spin. The power of machine learning at your fingertips: that's pretty awesome! Today I managed to hook up a Xamarin app to the Computer Vision API to do some image recognition. The basic idea of the app is really simple: let Computer Vision tell you what you're looking at. Simply take a picture, pass it along to the Computer Vision API, and display the result in the app. It'll tell you what it thinks you're looking at.
Because this is a simple demo, we'll be using Xamarin.Forms. Although I'll only focus on iOS here, you can easily extend it to Android and/or UWP. The source code can be found on GitHub.
So let's see how you can use the Computer Vision API inside your Xamarin app. A small hint: it's really simple!
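The take-a-picture/send/display flow boils down to one REST call. As a language-agnostic sketch (Python here rather than the article's C#), the snippet below builds the "analyze" request asking for a description of the image and picks the best caption out of the JSON response. The endpoint, region, and response shape are assumptions based on the v1.0 Computer Vision API, so verify them against the current API before use.

```python
import json
from urllib import parse, request

def build_analyze_request(image_bytes, key="<your-key>"):
    """Prepare (but don't send) the Computer Vision 'analyze' call for a photo."""
    query = parse.urlencode({"visualFeatures": "Description"})
    return request.Request(
        "https://westus.api.cognitive.microsoft.com/vision/v1.0/analyze?" + query,
        data=image_bytes,
        headers={
            "Ocp-Apim-Subscription-Key": key,
            "Content-Type": "application/octet-stream",
        },
        method="POST",
    )

def best_caption(response_body):
    """Pick the highest-confidence caption from the JSON response body."""
    captions = json.loads(response_body)["description"]["captions"]
    return max(captions, key=lambda c: c["confidence"])["text"]
```

In the app, the photo bytes from the camera go into the request, and the returned caption (e.g. "a painting of a woman") is what gets shown on screen.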
Tags: computer vision cognitive services xamarin microsoft mobile