|
Post by atteSmythe on Jun 2, 2011 2:14:53 GMT
A while back, someone recommended hearing Robot in Legion's voice. I disagreed to myself at the time, but the more we see of him lately, the more it fits in my head. Robot is scaring me... He never used to be this intense. Yeah, the tone of his voice sounds authoritarian to me, like he is taking control of the situation. He could just be taking the situation very seriously, but in my mind he almost comes off like a '20s or '30s mobster in that last panel. Well, he's Kat's self-appointed evangelist and guardian; I think this is just an extension of those assumed duties. He's infatuated with her... he's been pretty intense for a while now, IMO.
|
|
|
Post by Max on Jun 2, 2011 2:49:29 GMT
> Robot is scaring me... He never used to be this intense.

Really? I didn't read it as that intense. It's hard to tell tone from the written word, but I just read it as a simple "[you should] let her work."
|
|
|
Post by crater on Jun 2, 2011 3:57:11 GMT
> > Robot is scaring me... He never used to be this intense.
>
> Really? I didn't read it as that intense. It's hard to tell tone from the written word, but I just read it as a simple "[you should] let her work."

His chat bubble is spiky and random. It reminds me of Dream's chat bubble without the inverted colors. Dream (from Sandman) was an intense and dutiful person.
|
|
|
Post by Max on Jun 2, 2011 6:54:33 GMT
That's how all robots' speech bubbles are rendered. You can see it in the preceding pages.
|
|
Pig_catapult
Full Member
Keeper of the Devilkitty
Posts: 171
|
Post by Pig_catapult on Jun 2, 2011 8:49:31 GMT
Something that's really been bugging me is that... well... a good computing language is easy to decipher and comprehend. The fact that it's as big and hairy and complex as it's being described means that either the language is bad or the coding is even worse. Even if it were already compiled (i.e., in the format that a machine would read/run it in, rather than the format that makes it easy for humans to read and write), it shouldn't be this amazingly tangled.
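A minimal sketch of what I mean (the file name is made up and the disassembly varies by platform, but the commands are standard):

    /* add.c - trivial, readable source */
    int add(int a, int b) {
        return a + b;
    }

    cc -c add.c          (compile to machine code: add.o)
    objdump -d add.o     (disassemble it back into assembly)

Even the compiled form maps back onto the source in an orderly way; a genuine tangle points at the source or the language, not at compilation itself.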
|
|
|
Post by hal9000 on Jun 2, 2011 10:05:25 GMT
> Something that's really been bugging me is that... well... a good computing language is easy to decipher and comprehend. The fact that it's as big and hairy and complex as it's being described means that either the language is bad or the coding is even worse. Even if it were already compiled (i.e., in the format that a machine would read/run it in, rather than the format that makes it easy for humans to read and write), it shouldn't be this amazingly tangled.

Machine-generated code might be optimized for efficiency and not necessarily for ease of debugging/maintenance. The assembly generated by gcc with -O3 can be harder to understand than the assembly generated without it, for instance, and the robots probably don't have much use for human-readable debugging information (or whitespace, for that matter). Or it could just be, like you said, that the language used here is a bad one. I mean, it was presumably created by a guy some hundreds of years before anyone even invented modern programming languages, let alone easily-maintainable, well-designed ones.
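To make the gcc example concrete (a minimal sketch; the function is invented, and the exact assembly differs across gcc versions and targets):

    /* loop.c - sums an array; perfectly readable at the source level */
    int sum(const int *xs, int n) {
        int total = 0;
        for (int i = 0; i < n; i++)
            total += xs[i];
        return total;
    }

    gcc -O0 -S loop.c    (assembly stays close to the source)
    gcc -O3 -S loop.c    (unrolled/vectorized assembly, far harder to follow)

At -O3 the loop typically gets unrolled and vectorized, so the output no longer resembles the tiny source at all: correct, fast, and unpleasant to read.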
|
|
notacat
Full Member
That's not me, that's my late cat Mimi: I'm not nearly so cute
Posts: 188
|
Post by notacat on Jun 2, 2011 11:22:34 GMT
> Something that's really been bugging me is that... well... a good computing language is easy to decipher and comprehend. The fact that it's as big and hairy and complex as it's being described means that either the language is bad or the coding is even worse. Even if it were already compiled (i.e., in the format that a machine would read/run it in, rather than the format that makes it easy for humans to read and write), it shouldn't be this amazingly tangled.

Smalltalk is a wonderfully simple and easy language to read. A child can understand the language and concepts (no really, they used children as play-testers in the early development stages). I would not want to try to decipher a printout of the default system image, even with pretty-printing and syntax highlighting and all the bells and whistles ;D
|
|
percival
Full Member
there's a storm a-brewin'
Posts: 119
|
Post by percival on Jun 2, 2011 11:32:39 GMT
Sure is programmer/analyst in here.
|
|
|
Post by Chemical Rascal on Jun 2, 2011 11:48:48 GMT
Oh, come on, Tom. What kind of person writes "scribble" next to their arcane notes, seriously?
|
|
|
Post by goldenknots on Jun 2, 2011 12:18:20 GMT
> Oh, come on, Tom. What kind of person writes "scribble" next to their arcane notes, seriously?

I'm pretty sure that's just "sound effects," rendered visually. Think "POW" written next to a left hook.
|
|
|
Post by jayne on Jun 2, 2011 19:57:22 GMT
> Something that's really been bugging me is that... well... a good computing language is easy to decipher and comprehend. The fact that it's as big and hairy and complex as it's being described means that either the language is bad or the coding is even worse. Even if it were already compiled (i.e., in the format that a machine would read/run it in, rather than the format that makes it easy for humans to read and write), it shouldn't be this amazingly tangled.

A good computer language is easy to comprehend *by the human user*. This language *may* be easy to comprehend by a robot user but barely comprehensible to humans, because of the way humans think. The way I think of it: a robot may be able to listen to ten people talking at the same time and comprehend everything, while a human would have a hard time picking up everything if even two people are talking at once.
|
|
|
Post by Geekette on Jun 3, 2011 5:24:58 GMT
> A good computer language is easy to comprehend *by the human user*. This language *may* be easy to comprehend by a robot user but barely comprehensible to humans, because of the way humans think. The way I think of it: a robot may be able to listen to ten people talking at the same time and comprehend everything, while a human would have a hard time picking up everything if even two people are talking at once.

A high-level language should be easy for the human user to comprehend, but the lower you go through the levels, the less true that is. Compiled languages like Java and C are a cakewalk. Assembly languages can be read, but they take more effort, and the written code is much less abstract; it doesn't have the rules that make compiled languages so nice to work with. Absolute machine code is binary, just 0s and 1s, and is definitely not meant for your regular user. You might be able to add/subtract/multiply in binary, but actually reading binary code takes a lot of experience.

Another point to keep in mind: part of the reason for this is that making languages easy for people is a newer aim. The lower you go, the older the languages are. The aim wasn't the user, but getting the machine - which was itself still pretty new and awfully complex - to work at all. Technicians could be trained, after all. It'd be interesting if Kat could recapitulate that evolution of programming and turn Diego's language into something other people could read.

What I wonder about is whether Diego's code is very low-level (possibly lower than binary), as our current technology trend would imply, or if it's the other way around. Could this language be even more abstract (i.e., higher-level), to the point that it's the human way of thinking in sentences and breaking up code that is the issue? Sort of how we jumped from procedural to object-oriented; a very different way of thinking is required.
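A minimal sketch of those levels (the C line is made up; the assembly and byte encoding shown are the standard x86 ones for this particular instruction, included purely as illustration):

    /* High level (C): the intent is readable */
    int five(void) { return 5; }

    /* Assembly (x86): readable, but tied to one machine */
    /*     mov eax, 5                                     */
    /*     ret                                            */

    /* Machine code: the raw bytes the hardware consumes  */
    /*     b8 05 00 00 00   c3                            */

Same program at every level; each step down sheds abstraction and gains machine detail.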
|
|
|
Post by smjjames on Jun 3, 2011 6:26:01 GMT
There's still the question of whether the robots changed or added any of the code, or did any kind of alteration, and what exactly they did. Sure, some alteration could be required for the changeover to modern electronics, but how much is truly original coding?
|
|
|
Post by jayne on Jun 3, 2011 12:10:58 GMT
> > A good computer language is easy to comprehend *by the human user*. This language *may* be easy to comprehend by a robot user but barely comprehensible to humans, because of the way humans think. The way I think of it: a robot may be able to listen to ten people talking at the same time and comprehend everything, while a human would have a hard time picking up everything if even two people are talking at once.
>
> A high-level language should be easy for the human user to comprehend, but the lower you go through the levels, the less true that is. Compiled languages like Java and C are a cakewalk. Assembly languages can be read, but they take more effort, and the written code is much less abstract; it doesn't have the rules that make compiled languages so nice to work with. Absolute machine code is binary, just 0s and 1s, and is definitely not meant for your regular user. You might be able to add/subtract/multiply in binary, but actually reading binary code takes a lot of experience.
>
> Another point to keep in mind: part of the reason for this is that making languages easy for people is a newer aim. The lower you go, the older the languages are. The aim wasn't the user, but getting the machine - which was itself still pretty new and awfully complex - to work at all. Technicians could be trained, after all. It'd be interesting if Kat could recapitulate that evolution of programming and turn Diego's language into something other people could read.
>
> What I wonder about is whether Diego's code is very low-level (possibly lower than binary), as our current technology trend would imply, or if it's the other way around. Could this language be even more abstract (i.e., higher-level), to the point that it's the human way of thinking in sentences and breaking up code that is the issue? Sort of how we jumped from procedural to object-oriented; a very different way of thinking is required.

I don't believe there is a lower level than binary. Either the current is low (0) or high (1). You could increase the complexity if the machine could handle more than just off and on, 0 and 1. I was wondering why the robots wouldn't code in straight machine language, since they are also machines. It might be that they are coding in machine language, but these machines use greater complexity than just binary.
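As a toy illustration of "more than just 0 and 1" (the hardware is purely hypothetical; this only shows the arithmetic of three-level symbols):

    #include <stdio.h>

    /* Pack three base-3 digits ("trits", each 0-2) into one number.
       Each trit carries log2(3), about 1.58 bits, so a three-level
       signal is denser per symbol than a two-level one. */
    int pack_trits(int t0, int t1, int t2) {
        return t0 + 3 * t1 + 9 * t2;    /* yields 0..26 */
    }

    int main(void) {
        /* three trits span 27 states; three bits span only 8 */
        printf("%d\n", pack_trits(2, 2, 2));    /* prints 26 */
        return 0;
    }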
|
|
|
Post by atteSmythe on Jun 3, 2011 19:19:33 GMT
> I don't believe there is a lower level than binary. Either the current is low (0) or high (1). You could increase the complexity if the machine could handle more than just off and on, 0 and 1.

Everything's analog. For the discussion a couple comics back, I wanted to find an article about a story I'd read a long time ago, but I couldn't find anything relevant. So, paraphrasing from memory:

There was a professor doing work with genetic algorithms, programming an FPGA (a programmable logic device). The FPGA was hooked up to a microphone as input, and had some sort of device it could output to (a voltmeter or a light, I can't recall). The goal of the algorithms was to be able to tell the difference between two audible tones - quite a feat, since the frequencies of those tones were very low compared to the speed of the logic device. The algorithms were scored by which ones could do it with the fewest logic elements.

After some time, the algorithms weren't just successful, but wildly so. In fact, they used a tiny fraction of the gates that would otherwise be necessary. Everyone was delighted, papers were written, and a presentation was planned. During the presentation, the researcher went to demonstrate the device he'd just talked about and... nothing! It didn't work at all.

Long story short, it turned out that each 'solution' only worked on the exact hardware on which it was evolved. It was so very efficient because it wasn't using the gates 'properly': it was operating them in the grey zones between bit states, relying on leakage currents and other flaws and peculiarities of that particular silicon in order to work.

The hardware dependency is kind of interesting as a discussion of the complications of this task, but the real point of the story (in this context, anyway) is that there's plenty of room to run digital devices in an analog manner, and that complexity can arise from doing so.
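For anyone curious, the skeleton of that kind of experiment is surprisingly small (a minimal sketch; the score() function here is a hypothetical stand-in for "program the FPGA and measure how well it separates the two tones"):

    #include <stdlib.h>
    #include <string.h>

    #define POP   32     /* candidate configurations */
    #define GENES 64     /* one 0/1 gene per logic-element setting */

    /* Hypothetical stand-in for the real measurement; it just
       counts 1-bits so the example runs on its own. */
    static double score(const unsigned char *genes) {
        double s = 0;
        for (int i = 0; i < GENES; i++) s += genes[i];
        return s;
    }

    /* Flip one randomly chosen gene. */
    static void mutate(unsigned char *genes) {
        genes[rand() % GENES] ^= 1;
    }

    /* Keep the fittest configuration each generation and refill
       the population with mutated copies of it. */
    static void evolve(unsigned char pop[POP][GENES], int generations) {
        for (int g = 0; g < generations; g++) {
            int best = 0;
            for (int i = 1; i < POP; i++)
                if (score(pop[i]) > score(pop[best]))
                    best = i;
            for (int i = 0; i < POP; i++) {
                if (i == best) continue;
                memcpy(pop[i], pop[best], GENES);
                mutate(pop[i]);
            }
        }
    }

    int main(void) {
        static unsigned char pop[POP][GENES] = {{0}};
        evolve(pop, 1000);
        return 0;
    }

Nothing in that loop knows or cares how score() gets its number, which is exactly how the evolved 'solutions' were free to exploit the analog quirks of one specific chip.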
|
|