Could be worse. You could have done a good job and get punished for it, because you're going against the status quo. My current project does that, and it sucks. Oh well, it's just one project. Either it ends next year or I quit the job.
KBZX5000
Posts
-
Sympathy, or smack me
Python
I sometimes do R&D projects (from the corporate side) with universities. Even though they suck at Python, they still insist on using it most of the time. I don't mind; Python is decent enough, but I'm not installing a crappy distro on a VM and sinking hours into setting it up. So I default to the Windows Subsystem for Linux. It works, and it's perfect if you don't care about X11 or Wayland.
-
Must....not....kill....
This is where the Warhammer 40K universe gets it right, I think. They lobotomize and reeducate people like that. Clean, cost effective, ecological. I've pitched a similar idea, but marketing disagrees every time I propose to stab people. The numbers check out, though; the reduction in technical debt and project costs is significant, even if you reallocate resources to pay for the funerals.
-
The march toward UWP/Core?
In tech, you have two types of evolution: the slow-paced kind that moves at a glacial pace but affects the entire industry, and the fast-paced novelty gimmicks mostly used to distract us so we don't get bored waiting for the glacial shifts to produce results.

.NET Core is the glacially slow type. Initially, around 2008 I think(?), its purpose was to get embedded devices to run C# and compete directly with embedded Java platforms. Shifts in the embedded market (Raspberry Pi for low cost, and the crushing dominance of Java for anything else) altered those plans over time, and Core was eventually repurposed as a cross-platform CLR. After waiting what seemed like an eternity, we eventually got a usable version with the release of 2.0.

The upcoming WinForms and WPF support is hype, a pitch designed to increase enterprise adoption; it's mostly a distraction. Once the Mono team gets Core running on WASM, we'll probably get an alternative UI stack that will eventually end up dominating the market. Probably HTML based? Maybe XAML? We'll have to wait and see. I personally hope for XAML, but anything XML based is fine, really.
-
Assumption is the mother of all fuckups
Ha! I'm in the same boat. I need to scrap an "obsolete" database-listener service, and they also want to retain all the functionality, like having manual edits in the database still trigger various workloads. *sigh*
-
Code First
We always do code first, but we also follow a very strict and sane DB design principle: "Everything is flat by design, and we optimize for lowest complexity." Juniors who optimize for "this is going to be faster and/or more efficient" get stabbed with a knife by me personally.

People prefer DB first because they like to decide what the data should look like... which is also fundamentally wrong. Data already has a defined structure when it gets into your system, and has a defined structure when it comes out. All you need to do is flatten it in a safe way so you can persistently store it while it's in between those two states.

The worst of the bunch are Enterprise folks who go on and on about Business Objects or Domain Objects. They're willfully ignoring their core business (= selling a product or service) in favor of making arbitrary decisions about things they've invented themselves. We call that "having a god complex", because it's fun playing god and creating stuff that people will build for you. I totally get that. But they should do that in Civ5 or something, not at work.
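To make "flatten it in a safe way" concrete, here's a minimal Python sketch (the names `flatten` and `unflatten` are my own invention, not from any ORM): nested input is flattened into dotted column names for flat storage, and rebuilt on the way out. A toy, assuming keys don't themselves contain dots.

```python
def flatten(record, prefix=""):
    """Flatten nested dicts into one flat dict with dotted column names."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

def unflatten(row):
    """Rebuild the nested structure from the dotted column names."""
    record = {}
    for name, value in row.items():
        parts = name.split(".")
        node = record
        for part in parts[:-1]:
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    return record
```

The point isn't the ten lines of code; it's that the structure lives at the edges of the system, and the storage layer stays flat and dumb.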
-
Wedding Bands?
It depends on upbringing and culture. In Western Europe, expensive rings are commonly frowned upon, partly because of the bad press associated with the diamond trade (= all diamonds are conflict diamonds by default), and partly as a rejection of the American movie ideal of getting down on one knee and presenting an expensive ring (= imitating movies is considered a sign of being immature or insincere). As a result, anything goes as long as it has personal significance. The majority of people opt for a plain ring or one with fake stones in the $20-$100 range. Not a lot of people actually wear them in public.
-
The Future of Razor
If you're going to update your skills, do neural networks instead and ignore the front end for one more year. Browser tech, and UI in general, has been stabilizing and syncing up ever since HTML5. WASM is expected to unify the entire stack into a single dominant language. The milestone we need to reach for that to happen is a workable garbage collector in WASM.

Meanwhile, everyone and their cat are preparing to fight for stack dominance. Rust and Qt are already actively going for it, but they have a relatively small following with limited tooling. C# is currently hitching a free ride on Mono's GC implementation, which is nowhere near stable enough, but good enough to build exploratory tooling with, like Blazor. Meanwhile, the .NET Core team is porting everything UI related and Enterprise-legacy related, shoring up for the paradigm shift that's inevitable.

I'm guessing Oracle will fuck up during all of this (hey, it's what they do) and miss the opportunity to solidify Java as the dominant language. If we're lucky, we get a full C# stack by 2020, ending this era of language-war nonsense. I don't care who wins it, as long as we unify the stack in the process.
-
2D Gaming Development / CocosSharp
Mmm. This project was open-sourced a month ago, with no corporate support. I'm all for open-source projects, but this is like MonoGame with fewer features, fewer developers, a smaller community, no money, and a slightly better open-source license. How will this ever survive? Is there a crowdfunding campaign funding it or something?
-
2D Gaming Development / CocosSharp
Sorry to jump on the cheerleading bandwagon, but there's no realistic C# alternative in existence. We needed a 2D engine for low-cost (so C#) tech demos. We considered Cocos (dubious support and future), MonoGame (slow updates, no web support), SDL (too much boilerplate needed) and Unreal (unsupported C# plugin). We really tried not using Unity, but even with its terrible bug support, terrible IDE design, terrible cloud support and terrible asset store... it's still the best tool for the job.
-
How do you decide...
Quote:
Do you treat your laptop like a (lease) car and get a new model every 3 years, or do you keep it until end of life?
Isn't end of life 2 years?
-
On the topic of conscious AI
It's kinda cool that your daughter got scored on her desire to "keep at it". That's an interesting metric, tbh.

As an anecdote, I can share my experience with low-IQ people coming up with puzzles: they are derivative in form and often incomplete, with multiple fitting solutions and no indication of which solution will be considered "correct". A good riddle is therefore a complex pattern: it follows a structured set-up, has a clue to identify the correct answer, and has an elimination factor to exclude wrong results.

By simply reading those properties, you have inevitably gained intelligence, because you automatically mesh your notion of a riddle with the properties I present. You might accept them or reject them, but you're bound by the conclusions you draw, recalling them partially the next time you have to come up with a riddle. This is the acquisition of intelligence. You digest, analyze, consolidate, repeat, forget the details. The higher your intelligence, the more patterns you can combine and reproduce with a measure of success, potentially opening the way to more interactions and more patterns.

In contrast, the more you revert to immediate self-gratification, the less complex everything becomes, which generally reduces the variance of the interactions you'll have, and the dumber you become. Smart people who suddenly decide to watch TV all day and never go out again don't stay smart. We're not machines that suddenly stop working, but we do deteriorate gradually over time.
-
I went to university for this, really?
In uni, I learned:
- how to build a complex 3D engine in an esoteric programming language nobody uses (= no internet resources)
- how to assess the complexity of algorithms theoretically, and measure their actual complexity in the field
- an esoteric variant of opcodes, not x86 based, to drive a CPU in an emulated environment
- how to build data structures of comparable quality to the C++ STL
..and that's just the stuff I remember off the top of my head. In general, I felt the quality of my uni was good. Totally worth the 1.5K euros a year. Well, maybe a bit less; if you resell your books, it's more like 0.7K a year. :)
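As an illustration of "measure their actual complexity in the field", here's a small Python sketch (all names are my own, and this is one common classroom approach, not the exact coursework): instrument an insertion sort with a comparison counter and check that doubling the input size roughly quadruples the work, as the theoretical O(n^2) predicts.

```python
import random

def insertion_sort_ops(items):
    """Insertion sort that also counts comparisons, so growth can be measured."""
    a = list(items)
    ops = 0
    for i in range(1, len(a)):
        j = i
        while j > 0 and a[j - 1] > a[j]:
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
            ops += 1
        ops += 1  # the comparison that ends the inner loop
    return a, ops

def measured_ops(n, seed=0):
    """Comparison count on a shuffled input of size n (fixed seed, repeatable)."""
    rng = random.Random(seed)
    data = list(range(n))
    rng.shuffle(data)
    return insertion_sort_ops(data)[1]
```

Counting operations instead of wall-clock time keeps the measurement deterministic: `measured_ops(400) / measured_ops(200)` lands near 4, which is the empirical signature of quadratic growth.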
-
A common bug-inducing pattern: building a grid out of linked nodes.
For pathfinding, I just draw a bitmap in memory at the lowest acceptable resolution, put all the obstacles on it, and run a modified version of A* on top of that. Then I collect the result, and *bam*, a path in < 100ms. The result usually gets stored on disk (as a bitmap), which makes debugging waaaaay easier.

This probably sounds god-awful to most people, but it's a really simple solution to a complex problem. Easy to maintain, easy to debug, easy to tweak on the fly (by modifying the brush thickness of various obstacles!). It also avoids the common rookie mistake: trying to build an efficient solution to a problem you haven't solved yet. First you build the easiest possible solution, with considerations made towards debugging / testing / maintenance. Then you profile your resource consumption, so you know for a fact what's slow and what's not. After that, you refactor until you run out of time or budget to do so. EZPZ
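A minimal Python sketch of the bitmap approach, with the "bitmap" reduced to a list of rows (0 = free, 1 = obstacle drawn in). This is plain textbook A* with a Manhattan heuristic, not the exact modified version described above, and all names are my own:

```python
import heapq

def astar(grid, start, goal):
    """A* over a 2D obstacle grid (0 = free, 1 = blocked), 4-connected moves.

    Returns the path as a list of (row, col) tuples, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan distance heuristic
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_heap = [(h(start), 0, start)]  # (f = g + h, g, position)
    came_from = {}
    g = {start: 0}
    while open_heap:
        _, cost, cur = heapq.heappop(open_heap)
        if cur == goal:
            path = [cur]
            while cur in came_from:  # walk back to the start
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        if cost > g.get(cur, float("inf")):
            continue  # stale heap entry, a cheaper route was found later
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = cost + 1
                if ng < g.get((nr, nc), float("inf")):
                    g[(nr, nc)] = ng
                    came_from[(nr, nc)] = cur
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None  # goal walled off
```

Because the whole world state is a grid of ints, you can dump it (and the resulting path) straight to an image file for debugging, which is the real selling point of the approach.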
-
On the topic of conscious AI
You can only recognize patterns if you have been exposed to them. This means you always need a precursor before intelligence can be established. I believe the exchange of ideas is the necessary initial precursor. If the wise old monk has never moved, he can't be wise. He needs to exchange ideas before he can become wise. Once he has intelligence, inaction will diminish it.

We currently score intelligence by testing the adoption and retention rate of measurable patterns. Someone who enjoys looking for patterns is considered smart. Someone who does not enjoy looking for patterns is considered dumb. So wait, does that mean I can become super smart just by reading / researching patterns? Yes. Exactly that. By any conventional measurement standard we possess, that's the thing that makes you smart.

The first problem with that is that most of our patterns are assumed to be self-evident, so they're never really written down in one place. The second problem is that people jealously guard the patterns they know, because it gives them a measure of power. You could call them trade secrets, but to me that seems to imply the information is somehow complex, which it often isn't.
-
On the topic of conscious AI
Quote:
Ask a 5 year old girl WHY she thinks the flower prettiest to her is pretty, and she can tell you.
Have you ever done this? Actually asked a child "why"? When I ask a 5 year old girl why she likes something, the answer is a stunted and confused mess, which means she's making up the answer on the spot. Fibbing is a nifty function of our brain, which generates most of our "on the fly" thinking. 5 year olds still suck at it, so it's easy to spot while it's happening. But that's not consciousness; that's just real-time retrieval of deep-stored memory, mashed through a lexicon and syntax parser.
-
On the topic of conscious AI
Have you noticed that during meditation, you're mostly suppressing your thought loop and enforcing minor sensory deprivation, until the residual brain activity becomes the primary thought loop? I honestly don't recommend it. I recommend segmentation: use a completely different set of skills, thoughts and feelings for part of the day. That way, the other neurons can get some R&R, and your emotions have more trigger moments, which helps to keep the chemical balance. Thought cycles have an intrinsic value. If I did nothing with them, it would feel like a waste.
-
On the topic of conscious AI
I define intelligence as: an emergent behavior that occurs when a group of self-sustaining pattern engines successfully exchange ideas over an extended period of time. Feelings are tools; they short-circuit our thought process with previously established follow-up actions. That mostly saves time and stops our neurons from getting overly exerted, but as a side effect it also makes our thought process more rigid.

Fun side note: I really hate Mensa. They kept stalking me for years, trying to sucker me into joining their little club. They tell me I'm smart, and yet they treat me like an idiot. I'm not paying anyone who wastes my time.
-
Blockchain: Next tech juggernaut?
Well, the hype cycle is already going down, so maybe in a year or so we'll have sensible applications. :laugh:
-
On the topic of conscious AI
I think people make small improvements by combining two known concepts at a time. Perhaps our ancestors once tried stabbing the fire with a pointy stick, and ended up with a torch.