Paul Ingles
Posts
-
Has .NET destroyed the Programmer's spark?
To some extent I agree -- the challenges (certainly for me when I was picking up MFC/ATL) were much more to do with actually getting stuff done and picking up slightly weird nuances. Mind you, a lot of my learning was to do with programming as a whole, so I probably wouldn't have picked up C# much quicker when I was starting out either. But I agree, part of the fun was very much seeing stuff come together, and understanding why. To some extent that's no longer relevant now that we have coherent libraries that work. Admittedly there are still things to watch out for -- memory management is a whole different ball game -- but to my mind it's essentially simpler, rather than it just being quicker to get to the end product. However, I still feel there's value in understanding what's going on under the hood, particularly when troubleshooting, even though you're not working with it anywhere near as directly.

I think what has changed is that it's now the application of techniques and technologies which sets people apart: knowledge of development patterns, good code-writing practices (paragraphing, good commenting etc.), refactoring. It's no longer a question of whether you can tackle the problem, but how effectively you can tackle it. Some of the biggest gains in my own development (beyond my initial learning of libraries and languages) have come from adopting practices like test-driven development, setting up continuous integration and getting custom reporting in.

I don't agree that there's no longer a wide spectrum of abilities within the .NET space. Fair enough, it's unlikely to be _as_ wide as with C++, but that's essentially because the range of things on which your knowledge can vary is smaller -- there were just far more things (certainly from my exposure to C++) you could shoot yourself in the foot with. However, I rarely read other developers' work without learning something, and occasionally I'm blown away -- the various .NET blogs and articles I read are testament to the fact that there are still extremely bright people in the .NET world, and they're just as important as the hardcore C++-ers. Both appear to exist in very large numbers on this website :)
-- Paul
MS Messenger: paul -at- oobaloo-co-uk
-
UI Frameworks/Toolkits
Hi, apologies if this question is a little too programming-centric, but if it is please let me know. I'm putting together a very lightweight monitoring program that will be used to display information from our build/continuous integration server on a large monitor in the office. The aim is to give info about improvements in code coverage, the number of acceptance tests passing etc. At the moment it just uses a very basic Windows Form which monitors the server and updates a label with the text needed (currently just whether the build succeeded or failed). This original effort was always intended to be a very short-term solution, and it isn't a nice way to do it at all, so I'm now trying to find something that will get me up and running with a minimum of fuss. Since all the data is stored in XML I did have a look for an HTML view that would be able to apply a stylesheet and generate an HTML representation. I've also looked briefly at MyXaml, but without really playing around with it I don't know whether it's really suited to what I'm after. So, I'm after any suggestions, the only requirements being that it needs to be .NET, and free/open-source. Is this the kind of thing MyXaml is intended for? Are there any other things I should be looking at? For instance, any kind of vector drawing toolkit that would let me draw text and very simple lines etc.?
-- Paul
MS Messenger: paul -at- oobaloo-co-uk
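A minimal sketch of the XML + stylesheet route mentioned above, in case it's useful to anyone reading later. The file names are hypothetical, and it assumes XslCompiledTransform (the .NET 2.0 class; on 1.1 it would be the older XslTransform):

    // Minimal sketch: transform the build server's XML status file into HTML with an
    // XSLT stylesheet, then let the monitor's browser (or a WebBrowser control) display
    // the result. File names here are hypothetical.
    using System.Xml.Xsl;

    class BuildStatusPage
    {
        static void Main()
        {
            XslCompiledTransform xslt = new XslCompiledTransform();
            xslt.Load("build-status.xslt");                    // stylesheet describing the page layout
            xslt.Transform("build-status.xml", "status.html"); // re-run whenever the XML changes
        }
    }

The nice thing about this approach is that changing the look of the page is just a stylesheet edit, with no recompilation of the monitoring app.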
-
Software Shop Equipment
Well, if you don't want to use Windows Server for VPN, there is an open source project called OpenVPN that I've used to connect my Dad's and my Dad's boss's computers together. It worked OK, although with dynamic IPs it was a little problematic.

Marc mentioned earlier that early on he felt automated builds weren't necessary. I would disagree and say that you're better off getting the process in place as soon as possible. We use NAnt for building, and setting it up retrospectively for projects has been far more complicated than doing it from the get-go. In particular, I would suggest putting the build file actually inside the solution, and then producing a build file stub on the build server. That way you get versioning of the build file, and it's also far easier to make changes than having to edit the file on the server directly.

As for other build-server tips: I would definitely recommend using a continuous integration approach rather than scheduled builds -- i.e. listen for changes rather than always building at one fixed point. This ensures that changes are caught as they go in. I've not used Vault (but hear it's extremely good); if it supports atomic commits it should make it unbelievably easy to ensure that a unit of work doesn't negatively impact anything elsewhere. You'll also want to get automated NUnit and NCover reporting in too, and potentially FxCop. With CruiseControl.NET as your CI server, you'll get a good web app for monitoring builds, and a tray app that can monitor and update you directly.

One other thing you'll probably want to do is get NAnt building deployables. This in itself has been a huge timesaver for us, although it probably depends on what you're building. We build web-based apps, so our build scripts automatically pull down the latest code, build, test, and then zip the web site (including all dependent binaries etc.) into an archive that can just be extracted onto the server. The filename is along the lines of
ProjectName-buildNumber-buildTime.zip
and is stored in a shared folder. That way, we have a complete archive of all builds. We also build a source zipfile containing the source tree used for that build. Although not strictly necessary with SCM, it does make it very easy to identify what code is currently on production etc. Oh, and another thing (sorry -- I appreciate this is probably over-long already), you'll also want to put tools such as NAnt, NUnit...
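To make the deployables point a bit more concrete, here's a rough C# sketch of the naming/archiving step. In practice NAnt's zip task does this for us; the share path and publish folder below are hypothetical, and ZipFile assumes a reasonably recent framework with System.IO.Compression available:

    // Rough sketch (hypothetical names): compose the ProjectName-buildNumber-buildTime.zip
    // archive name and drop it on the build share. The real pipeline does this via NAnt.
    using System;
    using System.IO;
    using System.IO.Compression;

    class ArchiveBuild
    {
        static void Main(string[] args)
        {
            string project = "ProjectName";                       // hypothetical project name
            string buildNumber = args.Length > 0 ? args[0] : "0"; // passed in by the CI server
            string buildTime = DateTime.Now.ToString("yyyyMMdd-HHmm");

            string zipName = string.Format("{0}-{1}-{2}.zip", project, buildNumber, buildTime);
            string outputPath = Path.Combine(@"\\buildserver\builds", zipName); // hypothetical share

            // Zip the published web site (pages, config and all dependent binaries).
            ZipFile.CreateFromDirectory(@".\publish", outputPath);
            Console.WriteLine("Archived build to " + outputPath);
        }
    }
-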
More unit testing questions....
One thing I'd also suggest is taking a look at using mock objects; they provide a nice way of letting you test interaction between objects that you may not be able to test 'reliably'. For instance, say part of your code needs to send emails out, but there's no guarantee that your email server will be alive/accessible to all developers when they're running the unit tests, or to the build/integration server. So, you build your EmailSender type around an interface that your code calls Send upon. With mocks you can dynamically create objects that implement the interface, tell your own code to use the mock through that interface, and then verify that the mock received the intended call, with the expected parameters etc. It also makes your unit tests nice and quick :) There are a few mock frameworks around for .NET (I don't know what language/libraries you're using, but I'm sure specialists in other areas would be able to suggest something if you were interested). For .NET, check out: http://www.nmock.org/[^] http://msdn.microsoft.com/msdnmag/issues/04/10/NMock/default.aspx[^] and for a more general treatment: http://www.martinfowler.com/articles/mocksArentStubs.html[^] Forgot to mention that although mocks are very useful in the example I mentioned, they also serve as a nice way of testing interaction between even non-expensive/trivial types.
-- Paul
MS Messenger: paul -at- oobaloo-co-uk
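To make the email example concrete, here's a minimal sketch. It uses a hand-rolled stand-in rather than NMock's dynamic mocks, and all of the type names are made up for illustration:

    // Minimal sketch: the order logic talks to IEmailSender, so the test substitutes a
    // hand-rolled mock and asserts on the interaction instead of hitting a real mail server.
    using NUnit.Framework;

    public interface IEmailSender
    {
        void Send(string to, string subject, string body);
    }

    public class OrderNotifier
    {
        private readonly IEmailSender sender;
        public OrderNotifier(IEmailSender sender) { this.sender = sender; }

        public void OrderShipped(string customerEmail, string orderId)
        {
            sender.Send(customerEmail, "Order shipped", "Your order " + orderId + " is on its way.");
        }
    }

    // A tiny hand-rolled mock that records the call it receives.
    public class MockEmailSender : IEmailSender
    {
        public string LastTo;
        public string LastSubject;
        public int CallCount;

        public void Send(string to, string subject, string body)
        {
            CallCount++;
            LastTo = to;
            LastSubject = subject;
        }
    }

    [TestFixture]
    public class OrderNotifierTests
    {
        [Test]
        public void ShippingAnOrderSendsOneEmailToTheCustomer()
        {
            MockEmailSender mock = new MockEmailSender();
            new OrderNotifier(mock).OrderShipped("customer@example.com", "1234");

            Assert.AreEqual(1, mock.CallCount);
            Assert.AreEqual("customer@example.com", mock.LastTo);
        }
    }

A framework like NMock would generate the MockEmailSender part for you dynamically, but the idea is the same: assert on the interaction rather than the side effect.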
-
How to use non .NET DLLs in .NET environment?
As mentioned in the previous post, you can't add a reference to the DLL since it's neither a .NET assembly nor exposing COM interfaces -- when you add a reference to a COM DLL, VS.NET just generates a .NET interop wrapper around it so you can use it without declaring things yourself. So you can either go down the route of writing DllImport/P/Invoke stubs (although if you're using reasonably complicated data structures this can be time consuming, unless someone else has already converted them). Alternatively, you can use the (rather cool, in my opinion) It Just Works functionality within MC++. Effectively you write a .NET wrapper around the functionality exposed by the Intel libraries, linking etc. as you would with MSVC++ 6. Now, I'm a little rusty on this, I've not done any MC++/interop code for a good while now, but it should be relatively straightforward. Effectively you compile your assembly with the /clr compiler switch; this in turn generates managed code, including the interop plumbing that would otherwise be up to you to write yourself in C#. Of course, you'll need to make sure you link with the libraries and include the headers. If you get stuck with that I'm sure people will be able to give you more detailed instructions.
-- Paul
MS Messenger: paul -at- oobaloo-co-uk
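For the DllImport route, a minimal sketch might look like the following. The DLL name, export and signature are hypothetical -- you'd substitute the real declarations from the Intel library's C headers:

    // Minimal sketch of hand-written P/Invoke stubs for a native (non-COM, non-.NET) DLL.
    // The library name and function are hypothetical; real signatures come from the headers.
    using System;
    using System.Runtime.InteropServices;

    static class NativeMath
    {
        // extern "C" double ComputeSum(const double* values, int length);  -- hypothetical export
        [DllImport("intelmath.dll", CallingConvention = CallingConvention.Cdecl)]
        public static extern double ComputeSum(double[] values, int length);
    }

    class Program
    {
        static void Main()
        {
            double[] data = { 1.0, 2.0, 3.0 };
            Console.WriteLine(NativeMath.ComputeSum(data, data.Length));
        }
    }

For anything beyond simple value types and arrays (structs, callbacks, opaque handles), the MC++ wrapper route usually ends up being less painful than hand-marshaling.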
-
Visual Studio.NET Beta - before you install it...
Don't forget this is largely to clean up after previous betas, which weren't intended to be of as high quality. The last VS.NET 2005 build I installed was one of the CTPs, which was never intended to be 'uninstallable'. In any case, I'm still waiting for it to download... still, it should be finished by the end of today.
-- Paul
MS Messenger: paul -at- oobaloo-co-uk
-
[Article] Is Redmond losing touch with its developers?
Don't forget that it's only the collaborative kind of tools that aren't included within the subscription; you still have access to the various testing/analysis type tools included within VS. From a developer point of view, the only thing (to my knowledge) that would be of use to see is the work item tracking side -- that is, where do your tasks get listed, and how do you then action them etc. However, because of the extensible/generic nature of the processes included within Team System, it sounds to me like even if you did have exposure to Team System, you'd still have to get to grips with the company's processes, rather than Team System per se. I personally don't see how it shuts the independent/small developer out, unless of course it's not possible to obtain development-licensed versions of the suite so that you could develop extensions to the Foundation Server etc. For example, I believe SourceGear are working on a *nix client for TS. It would be interesting to see whether small/micro ISVs could get copies of TS to develop against, but without licenses for production use (i.e. it becoming your method of tracking projects). I'm sure someone will tell me if I'm wrong, but I would assume attractiveness in the workplace is down to more than just exposure to the latest tools. Sure, it's a bonus if someone has already used ClearCase (for instance), but if a candidate hadn't I doubt it would be held against them -- using the tools is only a small part of getting up to speed with SCM. I was initially quite disappointed that TS would be outside of the Universal subscription (and priced quite so highly), but after reading other responses from devs I do now feel that the position is being somewhat blown out of proportion. I'm just too psyched about what's coming up in .NET 2, ASP.NET 2 and VS 2005 to worry that I'm not getting work item tracking and SCM shelving, great as they may be.
-- Paul
MS Messenger: paul -at- oobaloo-co-uk
-
[Article] Is Redmond losing touch with its developers?
I'd echo Michael's response. I've used a couple of Java IDEs over the years (I've only done a very minor amount of J2EE coding), and I've always found them drastically under-designed compared to VS. Although I've not tried IntelliJ, and that always seems to be the favourite.

I can understand the whole VB petition to some extent, largely from understanding the uproar when Windows 2000's lifecycle support was changed (I think) in response to large companies being unwilling to deploy XP having only just deployed 2000. Saying that, though, that's only from a maintenance standpoint. I don't understand why people would necessarily still want to target unmanaged Win32 code using VB, especially since it's not _actually_ native code in the first place. I guess the gripe is with the continued support of Visual C++ for writing Win32 apps, but not VB. Now, I know this is likely the wrong place to bring it up :P, but there is almost certainly a vast amount of legacy VB code out there in enterprises, still being maintained (I've had to do some myself), and without any good way to bring VB6 code up to VB.NET (ignoring the issue of bringing VB6 developers into the OO world) those maintainers are likely to feel left without somewhere to turn. Whereas despite the diminished pushing of VC++ from MSDN etc., the support is still there, with compiler and IDE improvements being made too.

I can also understand the gripes from MSDN Universal customers. We have MSDN Universal subscriptions here and I personally was looking forward to playing around with it. However, we're a relatively small shop and already have a large amount of the tooling in place -- a custom bug report/work tracking app, CruiseControl.NET, NAnt, NUnit etc. -- so I think the only thing we could really gain from it is dropping SourceSafe. And if the new version is significantly better we may be able to keep that; otherwise it looks like Subversion may be the way to go. I understand that people originally considered MSDN Universal to essentially be an all-you-can-eat type sub, but these kinds of tools were never in the arsenal, and if I'm not mistaken the majority of the things included within the subscription are dev/testing licensed, _not_ licensed for everyday use (excluding VS). So, although it maybe ought to be included for devs to produce add-ons etc., it probably wouldn't have ended up as an everyday-use license anyway -- it's no longer Visual Studio we're dealing with, rather a kind of development Office productivity suite on steroids. Not sure I've made myself too clear there...
-
Hosting Plans for Whidbey?
I don't know of any, but I'm also pretty sure that until Beta 2 of Whidbey is released there are unlikely to be any -- I'm sure someone will tell me if I'm wrong, but I think that one of the restrictions within the EULA for pre-Beta 2 releases is that the .NET Framework 2 isn't licensed for production, so any hosting companies (in my opinion) would be prevented from offering such a plan. However, when Beta 2 is released, I seem to remember reading that these restrictions would be lifted, allowing companies to put beta code into production. Of course, I would assume that hosting companies would be wary of deploying beta code, particularly to non-enterprise/free customers.
-- Paul
"Put the key of despair into the lock of apathy. Turn the knob of mediocrity slowly and open the gates of despondency - welcome to a day in the average office." - David Brent, from "The Office"
MS Messenger: paul@oobaloo.co.uk
Download my PGP public key
-
Bloggers and CV's
If I had a blog (and I've been meaning to get into it) I would imagine it would be useful. On my CV I included links to my own site (whilst it still had content), and included a link to my articles section here too.
-- Paul
"Put the key of despair into the lock of apathy. Turn the knob of mediocrity slowly and open the gates of despondency - welcome to a day in the average office." - David Brent, from "The Office"
MS Messenger: paul@oobaloo.co.uk
Download my PGP public key
-
Code Complete
Yep, I think it was actually one of the first programming-related books I bought. It was the old first edition with the kind of green/grey/brown cover. It eventually got so dog-eared that when I heard a 2nd edition was available I bought that too.
-- Paul
"Put the key of despair into the lock of apathy. Turn the knob of mediocrity slowly and open the gates of despondency - welcome to a day in the average office." - David Brent, from "The Office"
MS Messenger: paul@oobaloo.co.uk
Download my PGP public key
-
articles
I would definitely think some kind of area that keeps articles in the equivalent of pre-announcement purgatory would be a good way to go.

peterchen wrote: Finding enough Tutor-Types that refrain from turning everything into "it should be done how I would have done it"

I think some kind of voting system would be the best way -- i.e. an article only goes live provided it gets 2 yay votes etc., and people are free to recommend whatever at the bottom. I don't know though, you never know how these kinds of social interactions work out until you actually put them in place with people, but some kind of self-regulating system would be an interesting experiment. Anybody think it's worth posting as a suggestion, perhaps to see if it could be put in on a trial basis as an attempt to limit the cruft that gets posted? :)
-- Paul
"Put the key of despair into the lock of apathy. Turn the knob of mediocrity slowly and open the gates of despondency - welcome to a day in the average office." - David Brent, from "The Office"
MS Messenger: paul@oobaloo.co.uk
Download my PGP public key
-
articles
I like the idea of having a kind of preliminary review stage; it wouldn't be a bad way of getting the article quality up. How about having some kind of rotating system where people could volunteer to be involved as reviewers: they receive an email about an article in their given area of knowledge/expertise that's just been posted, they then maybe get to view it on a temporary page, post comments (as with current articles) and maybe get to vote on whether it should be approved. I personally would be happy to volunteer and perform some kind of article review, and would be happy (when posting) to have my articles subject to a peer review -- if anything, it could prove to be extremely beneficial from my point of view as an author. By ensuring that areas have a suitably high number of volunteer editors/approvers, it would take the pressure off individuals having to spend an extremely large amount of time reviewing everything.
-- Paul
"Put the key of despair into the lock of apathy. Turn the knob of mediocrity slowly and open the gates of despondency - welcome to a day in the average office." - David Brent, from "The Office"
MS Messenger: paul@oobaloo.co.uk
Download my PGP public key
-
Code Testing
Well, if you're using .NET then you're more than likely being asked to write some testing code using a framework such as NUnit or MbUnit -- allowing you to verify that your actual implementation works as it should. You may want to read up on black-box testing best practices to ensure that your tests adequately exercise your code. The review will probably therefore be examining not only the resulting implementation that provides some functionality/feature, but also the tests that ensure the code is doing what it should (and that it fails when it should fail etc.). Extreme Programming really is more to do with process than anything, and unit testing/test-driven/test-first development is just a part of it. XP encourages the writing of tests before the actual code; effectively the cycle is: write the test, build, watch it fail, write the code, build and test again, and repeat until it passes. Refactoring is another important part, to ensure that the internal code quality is as high as possible -- whether you do that at the end of each pass cycle, or at the beginning of each code cycle, is probably not massively important. Just that you identify areas where code could be refactored (known as code smells) to improve quality -- such as large bits of copy/pasted code that could be extracted into a method etc.
-- Paul
"Put the key of despair into the lock of apathy. Turn the knob of mediocrity slowly and open the gates of despondency - welcome to a day in the average office." - David Brent, from "The Office"
MS Messenger: paul@oobaloo.co.uk
Download my PGP public key
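As a tiny illustration of the test-first cycle with NUnit (all of the names below are made up): you'd write the test first, watch it fail because Basket doesn't exist or doesn't add up, then write just enough code to make it green, then refactor.

    // Minimal NUnit sketch of the test-first cycle: the test is written before Basket
    // exists, fails, and then just enough of Basket is written to make it pass.
    using NUnit.Framework;

    public class Basket
    {
        private decimal total;
        public void Add(decimal price) { total += price; }
        public decimal Total { get { return total; } }
    }

    [TestFixture]
    public class BasketTests
    {
        [Test]
        public void TotalIsSumOfItemPrices()
        {
            Basket basket = new Basket();
            basket.Add(2.50m);
            basket.Add(1.25m);
            Assert.AreEqual(3.75m, basket.Total);
        }
    }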
-
Read data directly from harddrive
There's nothing in the framework for accessing the hard drive directly, but you could call functions from the Win32 API. The page below covers the functions you can use. If you go to http://www.pinvoke.net/ you ought to be able to find the P/Invoke signatures to let you use them. Alternatively, you could create a class wrapper in Managed C++ that calls the functions directly, and then consume that within your C# app. http://jan.netcomp.monash.edu.au/ssw/files/win32.html[^]
-- Paul
"Put the key of despair into the lock of apathy. Turn the knob of mediocrity slowly and open the gates of despondency - welcome to a day in the average office." - David Brent, from "The Office"
MS Messenger: paul@oobaloo.co.uk
Download my PGP public key
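As a very rough sketch of the P/Invoke approach (needs administrator rights, and raw reads must be sector-aligned -- treat this as illustrative only):

    // Rough sketch: open a physical drive via the Win32 CreateFile API and read the first
    // sector. Requires administrator rights; reads on raw handles must be sector-aligned.
    using System;
    using System.IO;
    using System.Runtime.InteropServices;
    using Microsoft.Win32.SafeHandles;

    class RawDiskRead
    {
        const uint GENERIC_READ = 0x80000000;
        const uint FILE_SHARE_READ = 0x1;
        const uint FILE_SHARE_WRITE = 0x2;
        const uint OPEN_EXISTING = 3;

        [DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Auto)]
        static extern SafeFileHandle CreateFile(string fileName, uint access, uint share,
            IntPtr securityAttributes, uint creationDisposition, uint flags, IntPtr template);

        static void Main()
        {
            using (SafeFileHandle handle = CreateFile(@"\\.\PhysicalDrive0", GENERIC_READ,
                FILE_SHARE_READ | FILE_SHARE_WRITE, IntPtr.Zero, OPEN_EXISTING, 0, IntPtr.Zero))
            {
                if (handle.IsInvalid)
                    throw new IOException("CreateFile failed: " + Marshal.GetLastWin32Error());

                // Wrap the raw handle in a FileStream and read the first 512-byte sector.
                using (FileStream stream = new FileStream(handle, FileAccess.Read))
                {
                    byte[] sector = new byte[512];
                    stream.Read(sector, 0, sector.Length);
                    Console.WriteLine("First two bytes: {0:X2} {1:X2}", sector[0], sector[1]);
                }
            }
        }
    }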
-
Gentle.NET and Cascade Deletes
I've not got any experience with Gentle.NET -- have you considered NHibernate? It's still in alpha, but I believe that what it does provide is pretty stable and reasonably well tested (i.e. production quality), and I also believe it supports cascading deletes. As for a default mapping of commands to stored procedures, I don't know. You may also want to take a look at iBATIS.NET.
-- Paul
"Put the key of despair into the lock of apathy. Turn the knob of mediocrity slowly and open the gates of despondency - welcome to a day in the average office." - David Brent, from "The Office"
MS Messenger: paul@oobaloo.co.uk
Download my PGP public key
-
Creating Interop Wrapper
You'll have to expose the VB code as an ActiveX DLL, enabling you to use .NET's COM interop support. Here's an MSDN article that talks you through it: http://msdn.microsoft.com/library/default.asp?url=/library/en-us/csref/html/vcwlkCOMInteropPart1CClientTutorial.asp[^] You'll need to scroll down a bit, to where it says Example 1: Using Tlbimp. This imports the type library, creating a managed gateway to the COM interfaces implemented by the server; all you have to do then is ensure you reference the generated assembly when compiling. Alternatively (if you want), you can actually declare the interfaces and GUIDs inside C# by applying attributes. Again, this is documented inside the MSDN article; however, if you've got a large interface you will probably prefer the automated approach. For more information about .NET's interoperability, you'll definitely want to check out this section in MSDN: http://msdn.microsoft.com/library/default.asp?url=/library/en-us/cpguide/html/cpconinteroperatingwithunmanagedcode.asp[^] It covers things like threading, marshaling of types, lifecycle etc.
-- Paul
"Put the key of despair into the lock of apathy. Turn the knob of mediocrity slowly and open the gates of despondency - welcome to a day in the average office." - David Brent, from "The Office"
MS Messenger: paul@oobaloo.co.uk
Download my PGP public key
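For illustration, the attribute-based route looks roughly like this. The interface name, GUID and ProgID are hypothetical placeholders -- the real values come from the VB6 ActiveX DLL's type library (or you just run Tlbimp as described above and skip the hand-written declarations entirely):

    // Sketch of declaring a COM interface by hand in C# instead of running Tlbimp.
    // The interface, GUID and ProgID below are hypothetical; real values come from the
    // VB6 ActiveX DLL's type library. The automated alternative is simply:
    //     tlbimp MyVbLib.dll /out:Interop.MyVbLib.dll   (hypothetical file names)
    using System;
    using System.Runtime.InteropServices;

    [ComImport]
    [Guid("11111111-2222-3333-4444-555555555555")]          // hypothetical interface IID
    [InterfaceType(ComInterfaceType.InterfaceIsDual)]       // VB6 classes expose dual interfaces;
    public interface ILegacyCalculator                      // member order must match the type library
    {
        int Add(int a, int b);
    }

    class Program
    {
        static void Main()
        {
            // Create the COM object by ProgID and call it through the declared interface.
            Type comType = Type.GetTypeFromProgID("LegacyLib.Calculator"); // hypothetical ProgID
            ILegacyCalculator calc = (ILegacyCalculator)Activator.CreateInstance(comType);
            Console.WriteLine(calc.Add(2, 3));
        }
    }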
-
Headphone recommendations
OK, I'm not too sure how much more I can recommend... although that site does have quite a massive amount of info, you may be better off posting a message in the Headphone section asking for suggestions; if you give the same info as here you should get a few. Wearing glasses at the same time may cause a few comfort problems -- I've tried a few different headphones over the years, but because I don't wear glasses I don't know whether they'd become uncomfortable as a result. My immediate thoughts would be to suggest any Grado headphones within your price range (the SR225s are well regarded: http://www.headphone.com/layout.php?topicID=3&subTopicID=26&productID=0020090225[^]). But that may be overkill without a good source, so you might also go with the SR125s or SR80s. They're open, so if you're in a noisy environment they may not be great, but they're quite light and the foam doesn't get too hot. From what I've read (and from use -- I used to have a pair of their flagship RS-1s) they suit the music you describe. For ones that isolate you a bit more, I believe the AKG K271s are also a good choice (http://www.headphone.com/layout.php?topicID=13&subTopicID=71&productID=0020120271[^]), as are the HFI-650s (http://www.headphone.com/layout.php?topicID=13&subTopicID=71&productID=0020360650[^]). As I said, those are my first thoughts. I've owned Grados before (and they are excellent; they don't need a lot of power) -- the only reason I got rid of my RS-1s was because I had another pair and couldn't afford to keep them. One other thought: if you're unsure you may be better off buying a pair second hand (Head-Fi has a classifieds section), where you should get a pretty good deal on them, and it means that if you don't like them you can sell them on again for little loss. It's a good way of trying out a few pairs and seeing which you prefer. Hope that helps.
-- Paul
-
Headphone recommendations
Well, I can suggest a great place to go and read: http://www.head-fi.org/forums/[^] -- that place is an absolute gold mine of information. The first big question is where you'll be using them, and consequently how isolated they need to be. For instance, in a relatively noisy environment at work you'll need to look at some closed phones to prevent noise leaking in, but also to prevent what you're listening to from disturbing other people. Otherwise, you can go open, and accept that noise will leak. The other big question is budget: you can buy headphones that cost thousands and thousands of dollars (do a search for the Sennheiser Orpheus, a system that sold for over $30,000). Finally, the last big question is not only what kind of music you listen to, but what is providing the music -- portable CD player, hi-fi etc. If you're using a low-powered device (such as an iPod) you're going to struggle with certain headphones that require a little too much juice, unless you go with an amplifier as well. My best suggestion is to check out Head-Fi and read reviews. Feel free to let me know more about how you're going to be using them and I'll do my best to suggest stuff.
-- Paul
"Put the key of despair into the lock of apathy. Turn the knob of mediocrity slowly and open the gates of despondency - welcome to a day in the average office." - David Brent, from "The Office"
MS Messenger: paul@oobaloo.co.uk
Download my PGP public key
-
IE Toolbars on Windows Taskbar?
Great, thanks. Now I just have to set aside some time tonight and give it a go.
-- Paul
"Put the key of despair into the lock of apathy. Turn the knob of mediocrity slowly and open the gates of despondency - welcome to a day in the average office." - David Brent, from "The Office"
MS Messenger: paul@oobaloo.co.uk
Download my PGP public key