private static Assembly LoadAssemblyByName(string name)
{
    var myPreciousAssemblyLocation = Path.Combine(
        Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location), name);

    using (var fs = new FileStream(myPreciousAssemblyLocation, FileMode.Open, FileAccess.Read))
    {
        // A single Read call is not guaranteed to fill the buffer,
        // so keep reading until the whole file is in memory.
        var data = new byte[fs.Length];
        var offset = 0;
        while (offset < data.Length)
        {
            var read = fs.Read(data, offset, data.Length - offset);
            if (read == 0)
                throw new EndOfStreamException();
            offset += read;
        }

        // Loading from a byte array puts the assembly in the "Neither" context.
        return Assembly.Load(data);
    }
}
static void Main()
{
    // type1 comes from the statically referenced MyLibrary project.
    var type1 = typeof(MyPrecious);

    // type2 comes from the very same DLL, loaded again from a byte array.
    var myLibraryAssembly = LoadAssemblyByName("MyLibrary.dll");
    var type2 = myLibraryAssembly.GetType("MyLibrary.MyPrecious", true);

    CompareTypes(type1, type2);
}
The code above compares type1 from the referenced project with type2 from the same assembly, loaded a second time through Assembly.Load(byte[]). As a result, the library is loaded twice into the AppDomain, and AppDomain.CurrentDomain.GetAssemblies() lists MyLibrary twice. Comparing or casting across the two copies then fails:
System.InvalidCastException: [A]MyLibrary.MyPrecious cannot be cast to [B]MyLibrary.MyPrecious.
Type A originates from 'MyLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null' in the context 'LoadNeither' in a byte array.
Type B originates from 'MyLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null' in the context 'Default' at location 'c:\users\pwdev\documents\visual studio 2015\Projects\ConsoleApplication1\ConsoleApplication1\bin\Debug\MyLibrary.dll'.
at ConsoleApplication1.Program.Main() in c:\users\pwdev\documents\visual studio 2015\Projects\ConsoleApplication1\ConsoleApplication1\Program.cs:line 49
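The duplicate is easy to observe by listing what the AppDomain currently holds — a minimal sketch (the exact names and versions will differ on your machine):

```csharp
// After LoadAssemblyByName has run, MyLibrary shows up twice in the
// enumeration: once bound in the Default context (the project reference)
// and once loaded from the byte array.
foreach (var assembly in AppDomain.CurrentDomain.GetAssemblies())
{
    Console.WriteLine(assembly.FullName);
}
```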
Explanation
Yes, it’s all about the load contexts. There are three assembly load contexts: Load, LoadFrom and Neither. Usually there is no need to load the same library twice and run into the strange behavior described above, but sometimes there might be. The various Assembly.Load(From/File) methods each come with advantages and disadvantages; take a look at Choosing a Binding Context. Furthermore, consider what happens to an assembly’s dependencies when you load it. MSDN also describes best practices for loading assemblies: Best Practices for Assembly Loading. I have to say, in my whole career I have loaded assemblies by hand twice, and in hindsight, both cases were a mistake.
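One way to avoid the duplicate load is to hand the runtime an AssemblyName instead of raw bytes, so the binding happens in the default Load context and unifies with the statically referenced copy. A sketch (the helper name is mine, and it assumes the DLL sits on the probing path next to the executable):

```csharp
// Read the assembly's identity from the file, then let the runtime
// bind it in the default Load context. If MyLibrary is already loaded
// via the project reference, the same Assembly instance is returned,
// so the two Type objects compare equal.
private static Assembly LoadAssemblyInLoadContext(string path)
{
    var assemblyName = AssemblyName.GetAssemblyName(path);
    return Assembly.Load(assemblyName);
}
```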
TypeHandle
Instead of comparing the types in the examples above with the == operator, they can also be compared by their TypeHandle:
TypeHandle encapsulates a pointer to an internal data structure that represents the type. This handle is unique during the process lifetime. The handle is valid only in the application domain in which it was obtained.
Source: MSDN. Well, I can’t think of an interesting use for TypeHandles right now, but it’s good to know.
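A short sketch of the handle-based comparison (CompareTypes here is a hypothetical helper, not part of any library API):

```csharp
// Compare two types by their runtime handles rather than by reference.
// For the two MyPrecious types above this is still false, because each
// loaded copy of the assembly gets its own internal type handle.
private static bool CompareTypes(Type type1, Type type2)
{
    return type1.TypeHandle.Equals(type2.TypeHandle);
}
```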