Arthur Wang's Blog

Use Katana OWIN OAuth 2.0 to Secure your API Connection and Authentication Plus Refresh Token for .NET Developers

12/21/2016

In this article, I’ll show you how to use the OAuth 2.0 specification with OWIN and Katana to secure an ASP.NET Web API 2 service from scratch. If you are looking for token-based authentication with claims-based authorization, or for a way to run an independent self-host without relying on the IIS security architecture, then this article is for you.

There are tons of information and sample code out there in the wild, but with the rapid pace of technological change, many samples have grown old or simply stopped working. Search results often feel like archeological sites, containing artifacts from different eras that you have to sort out before you can reason about them. Since time flies, this article will be outdated one day too and dumped into one of those archeological sites like the others, so I’ll tell you exactly which technologies I used to build the sample code, in detail. We’re going to build a current token-based authentication and authorization setup for modern apps: self-hosted and .NET focused. The objective is to show how OAuth 2.0 authorization works, from requesting the access token, to using it to access a protected API, to seeing the refresh token in action. Many security practices have been omitted; we show only the minimal code needed to achieve our objective, and it must not be used in a production environment as is. I assume you will apply your own database and security strategies. Lastly, we will also use a tool to communicate with our OWIN/OAuth solution and develop a simple console client to interact with the host.

Is it just me?

There is a lot of confusion on the internet around what a simple, working OWIN solution looks like. Many samples contain old technologies mixed with other infrastructure, like data models and other unnecessary components, just to make things run. The OAuth/OWIN technology is fairly simple and elegant, but the learning experience can be frustrating if you are new. It almost seems as if some people on the internet deliberately confuse you so they can sell you a paid OAuth service. Conspiracy theories aside, before we dive into the details, it is imperative to know a few things to see the big picture of how OWIN and Katana can help us build our solution.

What is OAuth 2.0?

OAuth 2.0 is a protocol independent of Microsoft. It provides an authorization framework that enables communication between two or more independent HTTP services, such as Web APIs. Many open source communities and vendors, Microsoft among them, build their own OAuth 2.0 solutions based on this specification. The specification describes how a requestor asks the authorization server for an access token, which keys and values must be submitted in an HTTP POST, and the policies that govern how the authorization server should respond to the request.
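As a concrete illustration, a password-grant token request on the wire looks roughly like this (the sample credentials are the same ones used later in this article; your endpoint and values will differ):

```http
POST /Token HTTP/1.1
Host: localhost:8000
Content-Type: application/x-www-form-urlencoded

grant_type=password&username=kirk%40enterprise.com&password=enterprise&client_id=12345&client_secret=secret
```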
One of the confusing things about learning OAuth or OWIN is not OAuth itself but its flexibility. Great flexibility is sometimes not a good thing for beginners. Many components are built to work interchangeably within the OAuth framework, and many of them are open source projects with different names; from the names alone, you cannot tell what role each plays in the framework. In this article, we will focus on Microsoft’s OAuth solution.

Order your dinner to go tonight?

If you still have trouble understanding them at this point, let me offer an analogy. OAuth is like dinner, where Microsoft’s OAuth is the Japanese food, and Katana is the teriyaki chicken with shrimp tempura. You could just as well have Mexican or Chinese food for your OAuth dinner, and there are many combinations of components for creating your own special meal. I apologize if I’ve confused you further, but please keep these analogies in mind as you continue reading.

OWIN for .NET Developers

There is an open standard called the Open Web Interface for .NET, known as OWIN, and Microsoft's implementation of OWIN is called Katana.
Around OWIN alone there are names like Katana, Nancy, Jasper, Suave, Nowin, ACSP.NET, Freya, ASP.NET Web API, ServiceStack, and HttpListener, and the list goes on and on. Some components are deprecated, and each belongs to one of three roles: Host, Server, or Middleware.
Katana is a collection of OWIN-compatible components that make up the whole architecture, and it changes our perception of host and server. You should think of the Server and Host as functional components that serve other components in the architecture, rather than as a hardware server or the IIS web server. The Host manages the whole environment, from initiating to launching the process. An example of a server is the authorization server that takes care of authorization and ultimately grants the token. The Middleware contains layers of various frameworks that manipulate what goes in and out of the properties in the pipeline. Each framework can be a function, can act as a small application for a complex need, or can be as bare as a simple DelegatingHandler or a Func over an environment dictionary:
using AppFunc = Func<IDictionary<string, object>, Task>; // environment data in a dictionary, work represented as a Task
app.Use(async (ctx, next) => { await ctx.Response.WriteAsync("<html><head></head><body>Hello guys!</body></html>"); });
where ‘app’ is the IAppBuilder in Configuration(), ctx is the OwinContext (the environment), and ‘next’ is the next AppFunc in the pipeline.
The additional settings in the HttpConfiguration object form the last step, or layer, of the middleware.
Fortunately, with Katana we don’t need to write a lot of code. When you install the System.Web.Http.Owin assembly, you can use the UseWebApi extension method from the WebApiAppBuilderExtensions class to complete our pipeline by binding the middleware together. Because of the Web API host adapter design, components can be arranged in a pipeline structure and decoupled from one another, so every component in the Middleware can perform a different task on a request or response. With optional scoping, the developer can further narrow certain APIs or Middleware layers to smaller tasks.

   var config = new HttpConfiguration();
   app.UseWebApi(config);

Please note that the methods used in the Middleware are all asynchronous, Task-based methods. If an error occurs, the middleware should immediately return an error response to the caller rather than continuing down the pipeline. The OWIN implementation assumes the communication happens over an SSL/TLS connection, so we set AllowInsecureHttp = true only in the development environment, in the part of the code where we set up the OAuthAuthorizationServerOptions.
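For reference, that options object might be set up along these lines. This is a sketch only; MyOAuthServerProvider and MyRefreshTokenProvider are the classes built in the project steps later in this article, and the token lifetime is an arbitrary choice:

```csharp
// ConfigureAuth: wire the OAuth authorization server and bearer-token
// middleware into the OWIN pipeline.
public void ConfigureAuth(IAppBuilder app)
{
    var options = new OAuthAuthorizationServerOptions
    {
        AllowInsecureHttp = true,                          // development only; production must use TLS
        TokenEndpointPath = new PathString("/Token"),      // where requestors POST for tokens
        AccessTokenExpireTimeSpan = TimeSpan.FromMinutes(20),
        Provider = new MyOAuthServerProvider(),            // validates credentials and grants
        RefreshTokenProvider = new MyRefreshTokenProvider() // creates/receives refresh tokens
    };

    app.UseOAuthAuthorizationServer(options);
    app.UseOAuthBearerAuthentication(new OAuthBearerAuthenticationOptions());
}
```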

Let's start coding Katana

You need to have at least Visual Studio 2013 or above to build this project.  We are going to create two separate solutions, and each solution is going to have one project.  One project is for making the Host, and the other project is for building the Client. Now let’s create the Host project.
Project 1: Create AW Katana Self Host Server
The purpose of this project is to create a self-host server with the Katana OWIN components that is just minimal enough to grant a requestor an access token and process refresh tokens, without a database.
Visual Studio 2015 Community Version
Project Type: Windows Console Application with .NET Framework 4.5.2
Project Name: AWkatanaSelfhost
Package Install:
Microsoft.Owin.Host.SystemWeb
In Package manager console:
PM> Install-Package Microsoft.Owin.Host.SystemWeb
Install-Package Microsoft.AspNet.Identity.Core
Install-Package Microsoft.AspNet.Identity.Owin
Install-Package Microsoft.Owin.Security
Install-Package Microsoft.Owin.Hosting
Install-Package Microsoft.AspNet.WebApi.Owin
Install-Package Microsoft.Owin.Host.HttpListener
 

1. Create a new project and select Console Application with .NET Framework 4.5.2
2. Create two folders: Controllers and OAuthProviders
3. Open the Program.cs file; we will build the self-host web server here. Using Microsoft.Owin.Hosting, you can call the WebApp class and instruct the host to start the application from the Startup class, which we will build in the next step.
        static void Main(string[] args)
        {
            string baseUri = "http://localhost:8000";
 
            Console.WriteLine("Starting web Server...");
            WebApp.Start<Startup>(baseUri);
            Console.WriteLine("Server running at {0} - press Enter to quit. ", baseUri);
            Console.ReadLine();
        }
4. Create a class called “Startup.cs”, and inside this file add two references: System.Web.Http and Microsoft.Owin. This is where we assemble the OWIN/Katana component architecture. Using Microsoft’s OWIN IAppBuilder in the Configuration method, we build the OWIN HTTP pipeline in three separate tasks. The first is to configure the IAppBuilder app with information such as the token path, the providers, and custom options for the authentication pipeline, since we are going to build our own access token and refresh token. The second is to map the route of our resource. The third is to bring the route information from the HttpConfiguration into the last stage of the pipeline.
        public void Configuration(IAppBuilder app)
        {
            ConfigureAuth(app);
            var webApiConfiguration = ConfigureWebApi();
            app.UseWebApi(webApiConfiguration);
        }
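The second and third tasks live in ConfigureWebApi, which might look something like this sketch (the route template simply follows the standard Web API convention):

```csharp
// ConfigureWebApi: build the HttpConfiguration whose routes the final
// UseWebApi middleware layer will serve.
private HttpConfiguration ConfigureWebApi()
{
    var config = new HttpConfiguration();
    config.Routes.MapHttpRoute(
        name: "DefaultApi",
        routeTemplate: "api/{controller}/{id}",
        defaults: new { id = RouteParameter.Optional });
    return config;
}
```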
 
5. Create the MyOAuthServerProvider class in the OAuthProviders folder. This class is the brain of the entire architecture: it validates the incoming credentials against the security data we hold on the server. First it analyzes the incoming data and determines whether this is a new requestor asking for an access token or a returning requestor asking to renew an access token with a refresh token. The ValidateClientAuthentication method deciphers the incoming data and determines the next action. If it receives a username and password and the grant type is password, it passes control to the GrantResourceOwnerCredentials method, which verifies the credentials further and decides whether to grant an access token or reject the requestor. If it receives a refresh token and the grant type is refresh_token, the GrantRefreshToken method receives the call from ValidateClientAuthentication and, once the token is validated, issues a brand new ticket containing the new access token.
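Condensed, the two entry points might look like the following sketch. The hard-coded user and client credentials stand in for a real store, and error handling is trimmed to the minimum:

```csharp
// A condensed sketch of MyOAuthServerProvider.
public class MyOAuthServerProvider : OAuthAuthorizationServerProvider
{
    public override Task ValidateClientAuthentication(OAuthValidateClientAuthenticationContext context)
    {
        string clientId, clientSecret;
        context.TryGetFormCredentials(out clientId, out clientSecret);

        // In a real system, look these up in a secure store.
        if (clientId == "12345" && clientSecret == "secret")
            context.Validated(clientId);   // lets the grant methods below run

        return Task.FromResult<object>(null);
    }

    public override Task GrantResourceOwnerCredentials(OAuthGrantResourceOwnerCredentialsContext context)
    {
        if (context.UserName != "kirk@enterprise.com" || context.Password != "enterprise")
        {
            context.SetError("invalid_grant", "The user name or password is incorrect.");
            return Task.FromResult<object>(null);
        }

        var identity = new ClaimsIdentity(context.Options.AuthenticationType);
        identity.AddClaim(new Claim(ClaimTypes.Name, context.UserName));
        context.Validated(identity);        // a ticket (and access token) is issued from this identity
        return Task.FromResult<object>(null);
    }
}
```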

6. Create the MyRefreshTokenProvider class. This class is fairly self-explanatory: we implement the IAuthenticationTokenProvider interface from Microsoft.Owin.Security, and this is where we can customize our refresh token. In our project, we simply generate it as a GUID.
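A minimal sketch of that class might look like this; the in-memory dictionary stands in for the database a real implementation would use to persist tokens:

```csharp
// A sketch of MyRefreshTokenProvider: the refresh token is just a GUID,
// and the serialized ticket is held in memory keyed by that GUID.
public class MyRefreshTokenProvider : IAuthenticationTokenProvider
{
    private static readonly ConcurrentDictionary<string, string> _tickets =
        new ConcurrentDictionary<string, string>();

    public void Create(AuthenticationTokenCreateContext context)
    {
        var refreshToken = Guid.NewGuid().ToString();
        _tickets[refreshToken] = context.SerializeTicket();  // remember the original ticket
        context.SetToken(refreshToken);                       // hand the GUID to the caller
    }

    public void Receive(AuthenticationTokenReceiveContext context)
    {
        string serializedTicket;
        if (_tickets.TryRemove(context.Token, out serializedTicket))
            context.DeserializeTicket(serializedTicket);      // restore the original ticket
    }

    public Task CreateAsync(AuthenticationTokenCreateContext context)
    { Create(context); return Task.FromResult<object>(null); }

    public Task ReceiveAsync(AuthenticationTokenReceiveContext context)
    { Receive(context); return Task.FromResult<object>(null); }
}
```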

7. Create a simple API called FruitController as our resource, where the requestor can access our secret fruit list after their credentials have been verified, using the obtained token to call the API. There is no need to pass the username and password again when accessing the protected resource. The API can be protected simply by placing the [Authorize] attribute in front of the controller or an individual method.
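The controller can be as small as this sketch (the fruit values are illustrative); [Authorize] rejects any request that does not carry a valid bearer token:

```csharp
// The protected resource: reachable only with a valid access token.
[Authorize]
public class FruitController : ApiController
{
    // POST http://localhost:8000/api/Fruits
    public IEnumerable<string> Post()
    {
        return new[] { "Apple", "Banana", "Cherry" };  // our "secret" fruit list
    }
}
```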
​
The Self-Host project is now completed with 7 simple steps.

The Secret Recipe of the Refresh Token

Many articles and code samples explaining OWIN stop at how to generate the access token and never reveal how the refresh token works in code. The trick is not in MyRefreshTokenProvider, but rather in the MyOAuthServerProvider class. Per the OAuth 2.0 specification, the only required parameters are “grant_type” and “refresh_token”, as shown below:

   grant_type: refresh_token
   refresh_token: 3a3aebea-4150-4850-8e37-ace1d9eead9a [this is our sample; yours may have another format]

There are many ways to accomplish the same thing, but in our project the trick is to hide another authentication property, “as:client_id”, in the original ticket when the requestor first asks for a token. When the returning requestor comes back with the refresh token and asks for a new access token, the ValidateClientAuthentication method can verify the clientId and clientSecret against the original ticket, so we can be sure this requestor is the original one. Without this trick, the GrantRefreshToken method will never receive the call even though the grant_type and refresh_token parameters were passed in; a generic error message such as “invalid_grant” may come back, and you may never know why.
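In code, the trick might look roughly like this sketch: the client id is stashed in the ticket’s properties when the token is first granted, then checked when the refresh comes back:

```csharp
// When first granting the token (inside GrantResourceOwnerCredentials),
// hide the client id in the ticket's properties:
var props = new AuthenticationProperties(new Dictionary<string, string>
{
    { "as:client_id", context.ClientId }
});
context.Validated(new AuthenticationTicket(identity, props));

// Later, when the refresh token comes back, verify it against the caller:
public override Task GrantRefreshToken(OAuthGrantRefreshTokenContext context)
{
    var originalClient = context.Ticket.Properties.Dictionary["as:client_id"];
    if (originalClient != context.ClientId)
    {
        context.SetError("invalid_clientId", "Refresh token was issued to a different client.");
        return Task.FromResult<object>(null);
    }

    // Issue a brand-new ticket (and thus a new access token) from the old identity.
    var newIdentity = new ClaimsIdentity(context.Ticket.Identity);
    var newTicket = new AuthenticationTicket(newIdentity, context.Ticket.Properties);
    context.Validated(newTicket);
    return Task.FromResult<object>(null);
}
```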
Project 2: Create AW Katana Client
This client app will be the requestor for the AW Katana Self-Host.
Create another .NET console project in a separate solution; this is the client that requests the access token and uses it to access the protected resources.
Project Type: Windows Console Application with .NET Framework 4.5.2
Project Name: AWkatanaClient
Package Install:
PM> Install-Package Microsoft.AspNet.WebApi.Client

I’ve created two separate methods in Main(). One demonstrates how the refresh token works, and the other demonstrates how to access a protected resource. You can comment out one of them to examine the mechanics of token generation and consumption. Please run the AWkatanaSelfhost project first and AWkatanaClient afterward for a correct testing experience. See the results below:
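The client side of the first request might be sketched as follows; the method name is illustrative, and ReadAsAsync comes from the Microsoft.AspNet.WebApi.Client package installed above:

```csharp
// Request an access token from the self-host with the password grant.
static async Task<string> RequestTokenAsync()
{
    using (var client = new HttpClient())
    {
        var body = new FormUrlEncodedContent(new Dictionary<string, string>
        {
            { "grant_type", "password" },
            { "username", "kirk@enterprise.com" },
            { "password", "enterprise" },
            { "client_id", "12345" },
            { "client_secret", "secret" }
        });

        var response = await client.PostAsync("http://localhost:8000/Token", body);
        var json = await response.Content.ReadAsAsync<Dictionary<string, object>>();
        return (string)json["access_token"];  // use as "Authorization: Bearer <token>"
    }
}
```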

How to use Postman to test your Katana Host?

Postman is a tool that you can install or add to Chrome as an extension. Basically, Postman acts as a client that sends an HTTP POST with your desired parameters in the header and body to the host. Before you use the tool, you need a solid understanding of how client and server communicate: the HTTP protocol defines rules and carries information in the request and response that dictate how a client like the browser should behave and how the server should respond. To test our AWkatanaSelfhost project, we need to use a “POST” instead of a “GET” action. Download Postman

How to emulate a client requesting the access token for the first time?

1. Change to “POST” from the drop-down list
2. In the URL text box: http://localhost:8000/Token
3. Click on the “Body” tab, and ignore the “Authorization” and “Headers” tabs
4. Select the radio button: application/x-www-form-urlencoded
5. Add the following keys and values
grant_type: password
username: kirk@enterprise.com
password: enterprise
client_id: 12345
client_secret: secret
6. Click on the “Send” button
Emulating a client requesting the access token for the first time

How to emulate a client accessing the protected resources?

1. Copy the access_token value from the previous response body [yes, the whole thing; 3 lines]
2. Create another tab and change to “POST”
3. On the url text box: http://localhost:8000/api/Fruits
4. Click on “Headers” tab, and put this key and value [Remember: in headers and NOT in body]
Authorization: Bearer AQAAANCMnd8BFdERjHoAwE_Cl-A..<---your access token code here
5. Click on the “Send” button
Emulating a client to access the protected resources, ../api/fruits

How to obtain a new access token from your refresh token?

1. Copy the refresh_token value from the first response body [e.g. 870dd360-f41e-48e9-91d7-2790b0dc11aa]
2. Create another tab and change to “POST”
3. On the url text box: http://localhost:8000/Token
4. Click on “Body” tab, and put this key and value
grant_type: refresh_token
refresh_token: 870dd360-f41e-48e9-91d7-2790b0dc11aa [from step #1]
client_id: 12345
client_secret: secret
5. Click on the “Send” button
Emulating how to use your refresh token to obtain the access token again

Summary

I hope this article has been fun and helpful for learning to use OAuth 2.0 to secure your API services with Katana. You can download the code from here:
  • AWkatanaSelfhost
  • AWkatanaClient

Useful Links
Postman Tool - https://www.getpostman.com
OAuth 2.0 Official Standard - https://tools.ietf.org/html/rfc6749#section-4.3.2

Taming the Wild Wild Web Development Tools like NPM, Bower, Gulp, Grunt and Node with Visual Studio 2015

5/26/2016

If you are new to Visual Studio 2015 and wonder why all these external web development tools, such as Bower, NPM, or Grunt, are included in the new Visual Studio, then this article is for you. There are tons of articles on how to use specific tools and on which one is better than the others. In this article, we instead try to describe the ecosystem of these modern web development tools as a big picture and show how tooling has grown complicated over time. The quick and straightforward examples shown here will help you get started with these tools and understand the reasoning behind them.

As ASP.NET developers, we are used to getting .NET Framework version upgrades and additional features, like the NuGet Package Manager, whenever a new version of Visual Studio comes out. But this time is different. The familiar folder structure is gone, and so is the web.config file. There is even a ghostly “Dependencies” folder that appears in the Visual Studio Solution Explorer but not in the physical folders, and there are unfamiliar default files [see figure below]. People even seem to be moving away from Web Essentials, the Visual Studio extension that was a way to use the external tools outside of Visual Studio. Even the use of the NuGet Package Manager is discouraged; instead, we see more command line tools flourishing in the web development landscape, as if we were going backward to the early DOS age after all these years of progress. Are we going back to square one, and what’s going on?
Showing Solution Explorer in Visual Studio 2015
To understand this, we must look at it from the perspective of the web development ecosystem, not just at Visual Studio itself, and not assume that Visual Studio merely wanted to add some newer tools like NPM or Node.js to the product. We must first recognize that software development as a whole is undergoing a phase of rapid growth. New tools or upgrades come out on a weekly or even daily basis; we have never seen such a fast pace in web development compared with past years. To stay competitive as a product, Visual Studio needs to acquire new external tools and adopt the latest web development trends to modernize itself and maximize its number of users.

Moving away from NuGet

As Visual Studio users, we are used to letting NuGet manage our dependencies, and there is no doubt it was a great tool. However, NuGet is geared toward the Microsoft .NET ecosystem, and not all client-side libraries are submitted to the NuGet repository. Visual Studio users could be cut off from the outside world and unable to obtain the latest packages or new technologies. As a result, Visual Studio needs to align itself with the thriving development community; Microsoft does not have the resources to chase every trend and reinvent the wheels, but within those constraints Visual Studio still finds ways to integrate these new tools into web development.
 
Two popular package managers, Bower and NPM, tend to replace the NuGet tool, at least for non-.NET components such as JavaScript or CSS. You just need to choose one or both to manage your dependencies. NPM can be used for both server-side Node packages and client-side packages, while Bower is the more popular client-side package manager.

If you use Visual Studio 2015, NPM and Bower are installed by default, and you can manage the versions of these tools from Tools --> Options --> Projects and Solutions --> External Web Tools.
The list shown on the right of the Options window tells Visual Studio where to find dependencies, searching from top to bottom. So Visual Studio looks for the node_modules\.bin folder in your project first, and if it cannot find it, it looks for the folder under C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\Extensions\Microsoft\Web Tools\External, and so forth.
If you do not use Visual Studio, then you need to install the tools yourself, in this order:
1. Install Node.js from nodejs.org, which also installs npm by default
2. Install Bower from the command line using npm [npm install bower -g]
3. Install Git from http://msysgit.github.io/ and use Git to download packages from GitHub.

Where is the confusion?

For Windows users, we are used to no-brainer installation in the Windows environment: just double-click and install. But to set up a web development toolchain without Visual Studio 2015, there are many steps just to install the tools, and we have not even talked about how to use them yet. Life is easier in Visual Studio 2015. Before we talk about using them in VS 2015, let’s compare Bower and NPM, since they do similar things as package managers; which one you need really depends on your situation. For example, if you need to keep two versions of jQuery in production, you need NPM, but if you need only a single copy of jQuery, you should use Bower, since it installs exactly the designated version of the packages you specify. So originally NPM was used for installing Node.js modules, and Bower leans toward managing front-end components like CSS, JS, etc.

Here is how npm works:
 
There is a registry on the internet (npmjs.org) where tool developers can publish their work; it is powered by the CouchDB database. Once you have installed Node.js on your local machine, you can invoke npm from the command line to manage your packages. In your project folder, you need a meta file named package.json that tells npm what to do. npm then goes to the registry, finds your missing packages, and brings them into the node_modules folder on your local machine.
So in Visual Studio 2015, all you need to do is open the package.json file and add the name and version of the package under the devDependencies section; as soon as you hit the save button, Visual Studio acts as a proxy to npm and grabs those packages for you.
Showing the content of package.json used by npm.exe
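The screenshot is not reproduced here, but a minimal package.json along these lines conveys the idea (the package names and versions are illustrative):

```json
{
  "name": "mywebproject",
  "version": "1.0.0",
  "devDependencies": {
    "gulp": "3.9.1",
    "bower": "1.7.9"
  }
}
```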
But if you prefer doing it manually, you can do it in the Package Manager Console inside Visual Studio.

The Build System

As you may have noticed, each package manager has its own destination folder for the packages it stores: Bower uses the bower_components folder and NPM uses the node_modules folder. That means we need another build step to copy those packages from the package folder into our web development or production folder. For example, if we use AngularJS in development, the AngularJS files may live in the node_modules folder if we use NPM, while we want them in our wwwroot/js folder. Grunt and Gulp are two of the most popular build systems for these tasks. However, NPM scripts combined with Node’s file APIs (such as fs.createReadStream) can act as a build tool as well. Nevertheless, if you need a more complex build process, many people choose either Grunt or Gulp, since they have a lot of plugins to choose from. In Visual Studio, the tasks defined inside a Grunt or Gulp script can be managed via the Task Runner Explorer, and you can do even more when you right-click on a task there.
Conclusion
 
In this article, we have seen how these little tools, such as npm, Grunt, and many others, are quite powerful and can do a lot for us as developers. They help us manage everything from our dependencies to our build process, and they could also help us test and do continuous integration; if we spend some time configuring them with scripts, a workable deployment pipeline can be built in the end. However, because these tools live in an open source environment with no one to supervise and manage them, there are advantages and disadvantages. When you need a tool's plugin to do something, you may find that you need dependencies after dependencies, which can add unwanted file size to your project. There are tools to fill the gaps of other tools, and management tools to manage other management tools. With fixes and upgrades inside the tools, a configuration script may work for a few months and then break when you upgrade a certain tool. It is an age of disruption overload, and it is quite chaotic if you think about it. Regardless of the frustration some developers have experienced, we need to do our part by testing and prototyping workflows that fit our needs. We should keep backup copies of these tools somewhere accessible, since the process usually assumes you have internet access. On security, we should also stay alert to the risks of such an easy but powerful tool as npm: since Node.js is already installed on our side, there is always the vulnerability of a malicious package coming in without any supervision. Hopefully, the technology will become more mature and reliable soon, so developers can spend more time building great software and less time finding the right tools for the right jobs and then figuring out how to use them to their best advantage.
Perhaps, Visual Studio can tame the modern web development tools in the future version.

Understanding the New .NET Core Technologies

4/27/2016

.NET Code Life Cycle in Desktop and UWP Applications
To understand the new .NET technologies, we must first understand the old ones in terms of the life cycle of code from design, build, and deploy to runtime. As we have mentioned previously, one purpose of the old .NET technologies was to let the operating system accommodate source code written in different programming languages and deploy the application code in the form of DLL files. These applications are either Windows desktop applications or ASP.NET web applications, executed by the processor when the user interacts with them on a Windows-based computer or server. At runtime, the CLR executes these DLL files in a Just-In-Time (JIT) fashion. As a result, the startup time for any Windows app is always slow, since the code must be JIT-compiled when the application first loads.

In the .NET Core technologies, one major purpose is to make the source code runnable on multiple platforms, regardless of the type of device: laptop, phone, tablet, or HoloLens. The goal of CoreCLR is to support other operating systems as well; for example, an application written in C# can be deployed and run on an Android phone, an iPhone, or even Linux. By contrast, CoreCLR is not used for building desktop applications, at least for now. It targets Windows Store applications, also called Universal Windows Platform applications, as well as ASP.NET apps and services.

Only a subset of CoreCLR is merged with your app code to produce the final native code. This minimizes the executable file size and the processor's load: the code is compiled ahead of time and served to the processor as static native code rather than compiled dynamically. Consequently, the startup time is much faster than with the old CLR method. We will describe how this works in detail next.

Prior to Visual Studio 2015, there was no big difference between Debug mode and Release mode in terms of how the application was built; the process was similar, except that more debugging features participated. In Visual Studio 2015, there is a significant difference between these modes in how the build is done. In Release mode, the build goes through the .NET Native process, where the .NET Native tool chain compiles the IL (Intermediate Language) binaries into native binaries. Unlike the old way, this time you need to specify the CPU architecture the app will be deployed to. In Debug mode, you run in a fashion similar to previous versions of Visual Studio, except that instead of the CLR you use CoreCLR, a brand new .NET Core runtime, to compile and run your code. This new CoreCLR is supposed to give you fast compilation and deployment along with rich debugging and diagnostics features.

Unlike the old way, where you package your final code into executable files such as DLLs or EXEs that look identical to the files running in release mode, you now fill in some additional information about your application on a submission form, and Visual Studio creates a package for submission to the Windows Store. The final package you submit to the store is not made of native binaries but of IL binaries, because the Windows Store takes care of the final build for your application: it creates different app packages with the .NET Native tool chain for whichever CPU architectures you specified on the submission form.
Read Part 3 of 3: Understanding .NET Native Tool Chain Under the Hood

The New ASP.NET 5 is Back to 1.0 and Is it a rename or a reset? (ASP.NET Core 1.0)

2/21/2016

ASP.NET 5.0 was supposed to be the successor of ASP.NET 4.6, but all of a sudden ASP.NET Core 1.0 will be the next version. This does not sound right if it was done merely for rebranding purposes. It looks very confusing at first, and perhaps it could be just another marketing gimmick to revive the ASP.NET line of products, so we should look closer at this announcement (January 2016).
 
This sudden change of name must have resulted from a heated debate within Microsoft: if you look at the milestones of the releases from Beta to RC, you will notice an inevitable shift from merely enhancing ASP.NET 4.6 to focusing entirely on supporting cross-platform development on .NET Core, a completely new framework rewritten from scratch (no code copied from .NET Framework 4.6). This sharp turn dates from the release of Beta7 on 9/27/2015 and onward. The decision might be painful and could cause confusion, undesired development issues, and a potential negative impact on developers in changing from ASP.NET 5 to ASP.NET Core 1.0. For example, it disrupts the continuity of version-based NuGet packages, since the name has changed. It may also affect all the libraries in the framework due to namespace changes. Inside Microsoft, they had to do a major refactoring, renaming their internal namespaces, before they could release the product and expect it to be perfect and error-free. That is a daunting task to complete within a short period of time, at the last stage of releasing a major product.

When Rebranding Becomes A Necessity
In recent years, the ecosystem of mobile devices has continued to evolve at a very fast pace.  Microsoft not only needs to adapt to changes in the ecosystem, but must also innovate quickly to stay competitive.  In November 2014, Microsoft introduced .NET Core, designed to be the foundation of all future .NET platforms, with an unprecedented embrace of the open-source development model.  It is a new era in which Microsoft has begun to join the communities and offer its modern .NET stack as full open source.  .NET Core is a new .NET framework built from the ground up, designed with platform independence in mind and intended to be open source.  In addition, due to complicated licensing and patent issues in the existing .NET Framework dating back to its first version, the new framework could not simply borrow existing code from the old one.  It had to have a completely new codebase in order to be implemented as a fully open-source framework!  Furthermore, since the purpose is to be a cross-platform framework, the compilation and execution process had to be revamped as well.  In the old framework, source code is compiled into CIL (Common Intermediate Language) code and stored in CLI (Common Language Infrastructure) assemblies in the form of DLL and EXE files.  Upon execution, the platform-specific CLR (Common Language Runtime) compiles the CIL into machine code that can then be executed.  In the new framework, .NET Core uses CoreCLR instead of the CLR to execute cross-platform .NET programs; its modular design works with all platforms, including Linux, Mac OS X, and the Universal Windows Platform.  Using NuGet, you can download the CoreCLR runtime along with its CoreFX libraries and then package and deploy them with your application.  In addition, you can optimize the deployment by including only the CoreFX libraries that you need.
This is a great leap, since your runtime environment can have a minimal footprint, which looks promising for future tiny devices with tight storage limits.
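To make that deployment model concrete, here is a minimal sketch of what a .NET Core 1.0 project.json might look like; the package name and version shown are illustrative of the RC/1.0 era tooling, and the exact values depend on your SDK version:

```json
{
  "dependencies": {
    "Microsoft.NETCore.App": {
      "type": "platform",
      "version": "1.0.0"
    }
  },
  "frameworks": {
    "netcoreapp1.0": {}
  }
}
```

Removing the `"type": "platform"` attribute is what switches you to a self-contained deployment, where the CoreCLR runtime and CoreFX libraries are published alongside your application instead of being resolved from a machine-wide install.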
 
Consequently, renaming ASP.NET 5.0 to ASP.NET Core 1.0 is both necessary and reasonable.  Besides being completely new and deserving its own timeline, it has a different architectural design: it is no longer based on System.Web.dll, the mother of traditional ASP.NET, and it is designed and optimized for cross-platform and cloud-ready environments.  You can even deploy your ASP.NET web app directly to Docker, a container platform.  The change should be good on all levels.  It is just like the old days, when ASP.NET arrived to replace classic ASP in order to build web applications and not just web sites.  ASP.NET Core is slowly replacing ASP.NET in order to build applications that can run on all platforms, and not just on Microsoft platforms.  There may be short-term pain from the confusion and extra work for developers, but from a long-term perspective, it should be good.
Roadmap of ASP.NET and ASP.NET Core
Will there be ASP.NET 4.7? 
Looking ahead, it is fair to ask "Will there be an ASP.NET 4.7?", since ASP.NET 4.6 is a mature framework while ASP.NET Core 1.0 does not yet include SignalR or Web Pages and offers only a subset of the full framework.  It will be interesting to see whether this becomes a dilemma.  On the one hand, ASP.NET Core 1.0 aims to replace the traditional ASP.NET track, but it is a stripped-down version lacking the full feature set of ASP.NET 4.6.  On the other hand, Microsoft may not want to keep maintaining the traditional ASP.NET track, since cross-platform is the future of application development.  We look forward to seeing whether ASP.NET 4.6 stays alive on its traditional path or is retired soon.
 
The Unpredictable Microsoft Number
This is not the first time Microsoft has had a hard time deciding on a version number.  For example, all of a sudden, Windows 10 was released, skipping Windows 9 entirely.  How about ASP.NET vNext?  Have you heard of it?  I was briefed on it by the Microsoft team at a developer conference about two years ago, around May 2014.  vNext was supposed to be the next generation of ASP.NET, designed to unify existing Microsoft technologies like ASP.NET MVC and Web API and, hopefully, be cross-platform.  Anyway, Microsoft quietly dropped "vNext" and renamed it to ASP.NET 5.  Well, I guess when no one can agree on things, it's time to hit the reset button.  I believe ASP.NET Core 1.0 is permanent and will stay with us for a long time.
 
Here are the details of the changes:
  • ASP.NET 5 is now ASP.NET Core 1.0
  • .NET Core 5 is now .NET Core 1.0
  • Entity Framework 7 is now Entity Framework Core 1.0 
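The renaming rippled into the NuGet package names as well, which is the continuity break mentioned earlier.  As a rough illustration (the exact set of renames varies by package), RC1-era package names mapped to new Core names along these lines:

```
Microsoft.AspNet.Mvc                 ->  Microsoft.AspNetCore.Mvc
EntityFramework.MicrosoftSqlServer   ->  Microsoft.EntityFrameworkCore.SqlServer
```

Because NuGet treats these as entirely different package IDs, projects upgrading from RC1 had to swap out their dependency entries rather than simply bump a version number.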

    Arthur Wang

    @ArthurWangLA
    MCSD App Builder
    MCSD Web Applications
    ​Member of Windows Insider Program & HoloLens Developer Community & Dev Center Insider Program

    Over 17 years of experience in web-based software development and management.  Specialized in Microsoft technology with the C# language and architecture design.  MCSD, MCSE, Microsoft Specialist, MCP + Internet, and a B.S. from UCLA

© 2014-2020 ArthurWiz.com All Rights Reserved.