Luc Pattyn wrote:
eternal loop
- so much more poetic than infinite loop!
Luc Pattyn wrote:
eternal loop
- so much more poetic than infinite loop!
And you stayed around to watch? :omg:
Buried in your response was a gem:
Member 2941392 wrote:
At design time, we reviewed ...
! People (myself included, sadly) often think of code reviews, but fail to think of design reviews - which are probably of equal (but different) value to code reviews. For a review with, say, a presenter plus three reviewers, do you have a sense of how much content (design ideas being very different content from (nearly) finished code) can be reviewed in an hour or two? Also, what was your experience of the maximum workable duration of a review session, and does it vary with the number of participants?
Awesome, clear suggestions!
Jason, What has your experience with code reviews been? While I accept that code reviews can have some benefits, I've found that there are ways that help and ways that don't.

One-on-one code reviews are often the most helpful at teaching new concepts to junior developers. There's plenty of room for question and answer, for showing how the design concept turns out in actual real-life code, etc. And it's a 'safe' environment where 'stupid' questions can be asked without embarrassment!

Team-based code reviews, where everyone scrutinizes a piece of work done by one of their peers, are (in my experience) really not so good as teaching tools - there are too many people in the room, and with people come opinions. Too many opinions lead more to confusion than to learning. I guess with a strong leader conducting the session, it's possible that this wouldn't happen - but then team members can end up feeling suppressed and denied the chance to fully participate / contribute. Further, such code reviews often get bogged down in nitpicking about coding style, use of comments / use of #region / variable names / adherence to (and value of) coding standards, etc.

If teaching new coding ideas in a team setting is the preferred approach, I would recommend something more structured than a code review. (More structured meaning something like formal training, but a structured in-house presentation / classroom / workshop is equally useful if the resources are available.)

The only time I was in a team-based code review that worked was (years ago) on a real-time project coded in C. Being real-time (without a user interface) it needed to be extremely reliable. The code review was a very appropriate mechanism to get many eyes on the code, looking for holes in the implementation of the algorithm, the error handling, etc. Being C code with no UI, it was also a relatively small body of code - making it feasible to do such a code review. Chris
Have you thought of having the team do a book study?
> give each team member a copy of a book that discusses the ideas you want the team to learn
> each week, the team reads a chapter (or a smaller or larger chunk of content - whatever is appropriate for the topic at hand) of the book (everyone reading the same chapter, of course!)
> each week, the team meets (on company time) to discuss the pros and cons of the idea(s) presented in the chapter.
The last step can be very effective if, each week, a different team member is asked to find a situation in their current project where they could apply the idea presented in the book. This provides for a useful discussion of the difficulties and benefits of using the idea; it also helps make the idea concrete for everybody, in a way that is often hard when you're just reading from the book. Sometimes, holding the meeting over lunch, and providing food (pizza and soft drinks commonly being the food of choice in the US), is both popular and effective. HTH, Chris
How did you know my beard was gray? :((
KP Lee wrote:
The scary part is that we have humans that write viruses now. What happens when a computer would think it would be cool to do that?
I'm not so sure that a computer writing a virus would be any more scary than a human writing one. But I'm also not sure that we could in any way predict the nature or behavior of an AI that 'arose' (evolved spontaneously).

We have this perception of computers being precise, fast, etc. Which they are. But that is only true for our programs (which are created to fulfill extremely explicit purposes). We can tell a computer to process a gazillion polygons to create a photo-realistic image as a scene in a game - and that requires very precise calculation of all the meshes and context information, etc. from which the scene is built.

But a computer-hosted consciousness, while necessarily implemented with that precision, will not, IMO, exhibit that same precision: the taxonomy of types in a computer program is necessarily limited (implementation); in any kind of 'real world' (which is the domain of awareness of an AI - whether it is 'our' real world, or the world of software and hardware as observed by an AI whose awareness is limited to the computer and software of its implementation) it is essentially limitless: an observed world does not come with convenient labels defining its constituents. Sure, the algorithms that underlie its consciousness will be that precise (implementation) - but so is the chemistry that operates the neurons in our own brains. And it takes a great deal of training and effort for us humans to manage our thoughts and mental models to be able to create computer programs (exhibited precision). Mostly, humans' thought processes are extremely fuzzy - confused by the nature of our brains having two hemispheres that process information in distinctly different fashions (integrative / holistic on the right; differential / exception detection on the left).

Now, it can reasonably be argued that an AI doesn't need the left brain/right brain architecture that human consciousness has - it's artificial, right? But I'd say it can equally be argued that we don't really know what it takes for consciousness/self-awareness to arise. It seems that the ability to distinguish 'me' from 'you' (i.e. to tell self from not-self) is crucial for consciousness that resembles human consciousness (this is a left-brain activity). But - is it possible for a viable AI to arise that is analogous
I've used the grid control from DevExpress[^] and like it. AFAIK they have a free 30-day trial, so you could experiment with it to get the appearance you want.
Apparently: "Inspectors were also concerned about the poor physical condition of the prison, litter and graffiti." So - they need a better grade of litter and graffiti? Your everyday, common-or-garden litter and graffiti isn't good enough for Her Majesty's prisons any more? :-D
I'd be happy to settle for the bank vault. Although the top secret military research lab would be cool, without a doubt. ;)
In addition to Luc's suggestion of encrypting off a file stream, have you considered compressing before encrypting? Assuming that your images are not JPG (or some other format that is already compressed) you might get a major benefit from compression.
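In .NET this is just a matter of stacking streams: plaintext goes into a GZipStream, which writes into a CryptoStream, which writes to the destination. A minimal sketch of the idea, assuming AES and GZip (the key/IV handling here is deliberately simplified for illustration):

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Security.Cryptography;
using System.Text;

using var aes = Aes.Create();   // random key/IV - fine for a demo only

// Compress first, then encrypt: data -> gzip -> aes -> output bytes.
byte[] Pack(byte[] plain)
{
    using var output = new MemoryStream();
    using (var crypto = new CryptoStream(output, aes.CreateEncryptor(), CryptoStreamMode.Write))
    using (var gzip = new GZipStream(crypto, CompressionMode.Compress))
        gzip.Write(plain, 0, plain.Length);
    return output.ToArray();    // valid even after the stream is closed
}

// The reverse: input bytes -> aes -> gunzip -> data.
byte[] Unpack(byte[] packed)
{
    using var input = new MemoryStream(packed);
    using var crypto = new CryptoStream(input, aes.CreateDecryptor(), CryptoStreamMode.Read);
    using var gzip = new GZipStream(crypto, CompressionMode.Decompress);
    using var result = new MemoryStream();
    gzip.CopyTo(result);
    return result.ToArray();
}

var original = Encoding.UTF8.GetBytes(new string('x', 10_000)); // highly compressible
var packed = Pack(original);
Console.WriteLine($"{original.Length} -> {packed.Length} bytes");
```

Note the ordering matters: encrypted data looks like random noise, so compressing *after* encryption gains you essentially nothing - hence compress-then-encrypt, never the reverse.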
If you're able to edit the code that renders the content you want to clear, perhaps you could adjust the clip rect before the 'content to be cleared' is drawn - thus in effect not drawing it (but leaving the paint method relatively unchanged)?
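In WinForms terms the idea might look like the sketch below - `RedactedPanel` and `SecretBounds` are invented names for illustration, not anything from your code:

```csharp
using System.Drawing;
using System.Windows.Forms;

// Hypothetical control: SecretBounds marks the content we don't want drawn.
public class RedactedPanel : Panel
{
    public Rectangle SecretBounds { get; set; }

    protected override void OnPaint(PaintEventArgs e)
    {
        // Exclude the region from the clip BEFORE the normal painting runs;
        // anything the existing paint logic draws inside SecretBounds is
        // simply clipped away - the paint method itself stays unchanged.
        e.Graphics.ExcludeClip(SecretBounds);
        base.OnPaint(e);
        // Call e.Graphics.ResetClip() here if later drawing needs that area.
    }
}
```

The nice property is that the original rendering code never needs to know it is being censored.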
When I'm interviewing candidates I always (try to) start by asking myself "what skills, attitudes and aptitudes should I expect of the candidate given their experience?" The CV/resume is very useful at this point - it says what they (claim they) have been doing - from which you can make a guess at the skills they've developed:
> have they been exposed to written specs and expected to turn them into implementation designs? (If so - ask about what they did / how they did it / how they interacted with peers to ensure that they'd created valid and complete designs, etc.)
> have they written a Windows service (for example)? If so, what Win32 or .Net APIs should they be familiar with, having done so?
> have they created (part of) an app with a heavy UI? Perhaps they've worked with Infragistics, DevExpress or some other UI package. If so, are they as familiar with the parts of the package as you'd expect them to be, given their claims? If they've done .Net UI work, are they familiar with the using statement (as distinct from the using directive)?
> etc.
What technology skills should the candidate have learned while doing what they claimed they did in their last job(s)? Given that the candidate is applying for a programming job there are some fundamental topics a candidate could (IMO) be expected to know:
> How do you set up a new project in the IDE?
> How do you add references to DLLs/libraries to a project in the IDE?
> (possibly) If your project is a DLL (for candidates who have created DLLs/libraries) how do you configure the project to debug it in the IDE?
> (in Visual Studio) What's the difference between a project and a solution? What is a post-build step?
> (in other IDEs) ask about how projects are built (does any platform still expose Make to developers?)
> (possibly) "What version control system(s) have you used, and what check-in policies did your last employer use?"
> (in .Net) How do you write debug information to the output window? (And, if they give a sufficient answer to that: what namespace contains the Debug object?)
BUT - what fundamental language and framework/library topics can you expect, regardless of experience? I'd say that there are probably few topics that you can be sure of (regardless of experience) but many that are possible/likely:
> what are Exceptions, and tell me how you have used them?
> what is the difference between the for and the foreach statements?
> tell me
Electron Shepherd wrote:
What's the point of testing software?
Um ... to keep the customer happy? ... because if I don't, I'll feel bad about myself for releasing crappy software? ... so I don't get fired? :^) Surely, all the 'real' reasons (to find the defects / to prove there are no (easily encountered) defects / etc.) really boil down to one of the above. I write software (a) because I enjoy it and (b) because it's my job. If I fail to test my work I'll know I've done a bad job and someone will get unhappy. And I'll either be working late to fix what I should have found and fixed earlier, or I'm polishing up my CV and on the job hunt again. Not that I would be pleased with an interview candidate who (only?) gave such flippant answers to the question "What's the point of testing software?", of course, because demonstrating respect is something I also look for in interview candidates.
Nah - a set of good speakers, and a synth tuned to the accelerator pedal position + speed of the car will do the job :-\
Microsoft has produced the Managed Extensibility Framework[^] that might be appropriate. It doesn't do anything specific to making pluggable parts of forms, but it does the work of dealing with dependencies on plugins that may or may not be there, etc.
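A minimal sketch of MEF's attributed model - the names `IPanelPlugin`, `StatusPanel` and `Host` are invented for illustration; a real app would typically point a `DirectoryCatalog` at a plugins folder, so plugin DLLs that "may or may not be there" are simply skipped:

```csharp
using System;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;

// Scan this assembly for [Export]s and satisfy the host's [ImportMany].
var catalog = new AssemblyCatalog(typeof(Host).Assembly);
using var container = new CompositionContainer(catalog);
var host = new Host();
container.ComposeParts(host);
foreach (var p in host.Panels)
    Console.WriteLine(p.Title);

// Hypothetical plugin contract.
public interface IPanelPlugin { string Title { get; } }

// One concrete plugin, exported so MEF can discover it.
[Export(typeof(IPanelPlugin))]
public class StatusPanel : IPanelPlugin { public string Title => "Status"; }

public class Host
{
    [ImportMany]   // stays an empty collection when no plugins are present
    public IPanelPlugin[] Panels { get; set; } = Array.Empty<IPanelPlugin>();
}
```

The `[ImportMany]` is the part that deals gracefully with optional parts: zero, one or many exports all compose without error.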
Didn't see the video, so maybe they addressed this, maybe they didn't, but ... octopus sounds like it has a Latin origin, so why do those oh-so-smart scientists seem to think that its plural should have a Greek origin?
There are many things I like about Resharper:
* Refactorings - for example ...
> being able to convert an auto-property to a property with a backing field because I now need to add behavior in an accessor
> being able to convert an anonymous method call into a lambda expression (or back) - which is very useful when teaching coworkers about lambda expressions
> there are many others
* Static code analysis result indicators in the right margin - where hints/suggestions/warnings/errors are flagged and clickable.
* Ctrl+Click - same as F12, but I don't need to move my hand off my trackball.
* Letting me know of possible null values (typically from method return values).
There are probably other things I like - but I've used R# for so long now that I've forgotten what's R# and what's native to VS. Other things - like identifying naming convention violations - I don't really care about. I know the house rules and don't need to think about them, so don't need a nanny checking up on me.
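For anyone who hasn't seen that first refactoring: it keeps the public surface identical while opening up the accessors. A sketch on a hypothetical Counter class:

```csharp
using System;

public class Counter
{
    // Before the refactoring this was simply an auto-property:
    //     public int Count { get; set; }
    // Converting it to a property with a backing field makes room
    // for behavior (here, validation) in the setter:
    private int _count;

    public int Count
    {
        get => _count;
        set => _count = value >= 0
            ? value
            : throw new ArgumentOutOfRangeException(nameof(value));
    }
}
```

Callers compile unchanged either way, which is exactly why the refactoring is safe to apply late.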
Did some research - apparently TFS does not require the use of Active Directory for user authentication (see MSDN: Managing Team Foundation Server in a Workgroup[^]) ... but I suspect using Active Directory may be simpler. And, now that I think about it, it wouldn't hurt to learn a bit about setting up Active Directory and a domain controller.