As I personally prefer developing in C# against the .NET Framework, I was curious about other opinions. Usually I just target the minimum framework version that makes sense for my project and don't worry too much about it, since I'm developing for end users, and my personal belief is that it's not a big deal to ask someone to update their system to a modern framework version. Now, though, I am developing an API that will be used by other developers and will need to be distributed with their projects, so I feel it's important to get the views of others.
In the past few years, development for the .NET Framework has grown drastically. A typical Windows XP machine will have version 2.0 installed, so that's basically the minimum version used in active development. That's not to say it will be the highest version on an XP machine, only that it's what will be there by default if the user has chosen not to keep up with updates. Windows Vista ships with version 3.0, and Windows 7 ships with version 3.5.
.NET Framework 4.0 is not installed on any machine by default, but it is a common update. It's my personal version of choice, since it adds a lot of nice features that make development easier, and it's not uncommon on any machine. Version 4.5 shipped with Windows 8, but I doubt I'll be targeting it any time soon, since it's pretty much only common on Windows 8 PCs, and we all know the story on that by now...
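For what it's worth, an installer (or the API's own bootstrap) can check which Framework versions are actually present before deciding whether the end user needs an update. Below is a minimal sketch using the setup registry keys Microsoft documents for this purpose; it's Windows-only, and the value checks are simplified (for example, it doesn't distinguish 4.5 from 4.0 via the "Release" DWORD):

```csharp
// Sketch only: reads the official .NET Framework setup registry keys to see
// which versions are registered on this machine. Requires a reference to
// Microsoft.Win32 (part of the base class library on the Framework).
using System;
using Microsoft.Win32;

class FrameworkCheck
{
    static void Main()
    {
        // 2.0, 3.0, and 3.5 each have their own subkey with an "Install" flag.
        foreach (var version in new[] { "v2.0.50727", "v3.0", "v3.5" })
        {
            using (var key = Registry.LocalMachine.OpenSubKey(
                @"SOFTWARE\Microsoft\NET Framework Setup\NDP\" + version))
            {
                bool installed = key != null && (int?)key.GetValue("Install") == 1;
                Console.WriteLine("{0}: {1}", version, installed);
            }
        }

        // 4.0 and later live under NDP\v4\Full; 4.5 additionally writes a
        // "Release" DWORD there, which could be inspected to tell them apart.
        using (var v4 = Registry.LocalMachine.OpenSubKey(
            @"SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full"))
        {
            bool v4Installed = v4 != null && (int?)v4.GetValue("Install") == 1;
            Console.WriteLine("v4 Full: {0}", v4Installed);
        }
    }
}
```

A check like this only softens the problem, of course; it tells you whether the update prompt is needed, not whether the user will accept it.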
More or less, I'm just curious. I would hate to stick to version 2.0 to cater to die-hard XP users who refuse to install updates and be unable to develop the way I want, but I also don't want to build an API that other developers won't use because it MIGHT force the end user to install an update first.
If you could share your thoughts, it would be greatly appreciated and help guide my development.