|
This seems like the most appropriate place to ask this ... I'm taking a computer architecture class that requires us to use the QtSpim MIPS simulator. For some homework I wrote a program that uses macros so I didn't have to repeatedly write out the tedious IO calls. Here's one: code:
|
# ? Sep 6, 2015 17:19 |
|
|
# ? May 11, 2024 10:14 |
|
Aurium posted:If the peaks might be able to source high current, and If your load is high impedance (like a uC input pin), a series (current limiting) resistor to a voltage divider to a zener diode shunt would be able to handle truly impressive over voltage conditions. So thinking back, I realized I said this wrong. If you put your current limiting resistor before your voltage divider, you'll throw your divider way off. You'd want to follow the divider with the limiting resistor. The other option would be to use high enough values in the divider that would inherently limit current. Which you'd probably be using anyway.
|
# ? Sep 6, 2015 22:44 |
|
I'm not sure if this should go here or into the C thread. I'm working on a little alarm on an Arduino and there's a 5-second window to enter the passcode using 3 buttons (it will eventually be a keypad once I get one). I want the input to be registered when the button is released, not when it's pressed. I can't think of a way to detect the button going HIGH then LOW without blocking the 5 second timer while I wait for it to return to LOW. Here's some slightly modified code: code:
|
# ? Sep 7, 2015 21:17 |
|
I don't know about arduino, but could you poll it on each loop? If it is high, and on the next loop it's polled low, then you've got a button press!
|
# ? Sep 7, 2015 22:34 |
|
I'd think waiting for the button to unpress would also have the same multiple-hit issues. Buttons usually 'bounce' multiple times on both press and release. Unless you added in some dead-time after the un-press registering. Maybe you can get clever with Pin Change Interrupts and some hardware timers to both debounce the switch press and ensure that a button held down isn't registered multiple times? You'd likely need some kind of global state for that to work.
|
# ? Sep 8, 2015 03:55 |
|
Read out the clock value every loop; if you read HIGH, wait until the loop that reads time + 20ms, and if you then read LOW, bam, debounced button read. You don't need a blocking wait() or anything, just set a flag to come back to it later. Also fyi, a good rule of thumb is that typical decent buttons need ~20ms of debounce time, 100ms for really sloppy presses. edit: actually read the code and tailored advice edit2: If you're new to embedded development, read up on state machines, this would be a good application for one ante fucked around with this message at 04:32 on Sep 8, 2015 |
# ? Sep 8, 2015 04:28 |
https://www.arduino.cc/en/Tutorial/Debounce
|
|
# ? Sep 8, 2015 14:11 |
|
Eeyo posted:I'd think waiting for the button to unpress would also have the same multiple-hit issues. Buttons usually 'bounce' multiple times on both press and release. Unless you added in some dead-time after the un-press registering. Ah, good point, I wasn't sure what was meant by "register", I assumed that the switch or board came with one ... On that note, it's a lot simpler to either add (sorted by decreasing laziness and increasing cost) an RC circuit or Schmitt trigger to the switch, or to buy a switch with the circuitry already in it. I was trying to find you a few of the latter, but the only reason I know they exist is because I ordered them by accident. dougdrums fucked around with this message at 15:47 on Sep 8, 2015 |
# ? Sep 8, 2015 15:31 |
|
Thanks guys, it works great now. Here's what I ended up with, I didn't like how long the function was so I split some more stuff out of it too:code:
|
# ? Sep 8, 2015 19:40 |
dougdrums posted:Ah, good point, I wasn't sure what was meant by "register", I assumed that the switch or board came with one ... Tell your hardware engineer that he has to debounce the switch inputs when it's a pretty simple task in software and see how he reacts. A debouncer is really simple in software if you can get a periodic tick in the 1-20 ms range. We use 30 ms as our debounce time with a 15 ms tick, so every 15 ms a routine to read inputs is called, and you keep the last 3 of these samples. If all three samples are 1, then set the input to 1. If all three samples are 0, set the input to 0. In any other case, leave the last known value for the input. On Arduino, it looks like you can use this module to call your debouncer regularly: http://playground.arduino.cc/Code/Timer using "every".
|
|
# ? Sep 9, 2015 02:26 |
|
The reaction will differ depending on what you're making. Sometimes the situation is more like "see how your firmware guy reacts when you tell him he needs to gently caress around with his simple comprehensible reliable design by polling or servicing interrupts more frequently and maintaining state to debounce eighteen buttons just so you could save eight cents on a design with a $3000 BOM". The other advantage of debouncing in hardware is that it is understood by nontechnicals to be much more of a PIA to change the schematic and/or layout than it is to change software, so it's easier to maintain a united front within the engineering department in telling your marketing guys to go suck an egg when they waffle infinitely over how debouncing should work (one of them will always think it's too long and misses legitimate keypresses, another will always think it's too short and doubles keypresses, no matter what duration you go with and regardless of whether you're edge or level triggered), whereas if you did it in software some of the hardware guys may be tempted to throw you under the bus by saying things like "yeah that shouldn't be that hard to tweak, go talk to $firmware_guy_currently_in_the_lab_instead_of_this_meeting about that, I'm sure he's not too busy...".
|
# ? Sep 9, 2015 02:45 |
Blotto Skorzany posted:The reaction will differ depending on what you're making. Sometimes the situation is more like "see how your firmware guy reacts when you tell him he needs to gently caress around with his simple comprehensible reliable design by polling or servicing interrupts more frequently and maintaining state to debounce eighteen buttons just so you could save eight cents on a design with a $3000 BOM". I think that's why the software approach makes more sense: reworking boards to change R or C, vs changing a few constants and rolling out new firmware. I mean, yeah, this is based on my experience. We actually do both, though I'm not sure why. We have an RC front end with an unhelpful time constant, so we also do it in software because that's The Way We've Always Done It. The algorithm isn't complex, but you're right that if you have hard real-time requirements then adding another tick could be annoying, but the mechanism is still really simple.
|
|
# ? Sep 9, 2015 03:06 |
|
Oh, well I'm my own hardware guy, but that's kinda moot since I haven't sold a design in a while. I figure it's a trade-off like anything else, and doing it in software saves some cost and board space, and maybe someone else's time. I find it simpler to take the hardware approach; I mean, if there's an engineer that freaks out over debouncing a switch when it's not a case of "oh, by the way ..." or for some reason you've gotten that far without it coming up, I'd be a bit worried. Actually I never just used an RC in anything important, I always get Schmitt triggers or the dedicated ones. I've always been under the impression that you need the hysteresis to avoid any issues, but it seems to work alright for everyone else? I think that's the answer for why you do both: you could still catch the switch in a bad state if it was RC alone, but if you do it in software too you get a sort of hysteresis. Also I wrote this post before I read the one above, that's pretty reasonable. I mean if someone just told me what the timings were and expected me to work with it, it's most certainly better to just do my job instead of griping about it for no reason. And if you're just learning, there's no harm in trying all of the approaches dougdrums fucked around with this message at 03:52 on Sep 9, 2015 |
# ? Sep 9, 2015 03:09 |
|
In GCC for ARM, is there a way to access SP and LR without dropping down into assembly? Inline assembly is kind of horrible to read and write when you can't use existing variables.
|
# ? Sep 14, 2015 04:11 |
|
Assuming you're using the CMSIS, core_cmFunc.h has __get_MSP() and __get_PSP() that abstract out the inline asm.
|
# ? Sep 14, 2015 08:06 |
|
I have been playing around with the ESP8266 and reading a lot of Lua code. It appears you can "compile" a file.lua into a file.lc and gain a (supposedly) significant performance increase and decrease in RAM usage. How does this work, does it just get compiled as a C program might? Or am I oversimplifying? Apparently variables take up a substantial chunk of the available memory. Also it says this: https://github.com/esp8266/esp8266-wiki/wiki 64KBytes of instruction RAM 96KBytes of data RAM When I type print(node.heap()) I get back between 2400 and 9900 - is that ~8KB instruction or data RAM? I'm running a slightly customized version of the NodeMCU firmware that has some DNS loopback stuff turned on but is built on the older 0.9.5, which has about 16KB less available memory than the newer 0.9.6. I'm thinking if I compile to .lc I can reclaim a good amount of memory and perhaps increase performance. Hadlock fucked around with this message at 08:45 on Sep 14, 2015 |
# ? Sep 14, 2015 08:43 |
|
... whoops, nm
dougdrums fucked around with this message at 16:10 on Sep 14, 2015 |
# ? Sep 14, 2015 16:01 |
|
Hadlock posted:I have been playing around with the ESP8266 and reading a lot of Lua code. It appears you can "compile" a file.lua in to a file.lc and gain (supposedly) significant performance increase and decrease in ram. How does this work, does it just get compiled as a C program might? Or am I oversimplifying? Apparently variables take up a substantial chunk of the available memory. I don't know about that tool specifically, but normally Lua compiles your script into its own bytecode at load time, which is then executed by an interpreter. A .lc file is usually just that bytecode precompiled ahead of time (it's what luac produces), so the device skips the parse/compile step: faster startup and real memory savings, because the compiler never has to run on-chip, though the code itself still executes in the interpreter at the same speed. There are also compilers for Lua (such as LuaJIT) that turn a script into native code for your platform, which will generally execute a lot faster, but that's a different beast from a .lc file.
|
# ? Sep 14, 2015 16:13 |
|
Captain Cool posted:In GCC for ARM, is there a way to access SP and LR without dropping down into assembly? Inline assembly is kind of horrible to read and write when you can't use existing variables. You might find https://gcc.gnu.org/onlinedocs/gcc/Return-Address.html interesting?
|
# ? Sep 14, 2015 16:25 |
|
So this happened today. He was wearing a NASA meatball shirt when he was arrested. He's a Freshman in high school near Dallas, Texas http://www.dallasnews.com/news/comm...k-to-school.ece dallas morning news posted:IRVING — Ahmed Mohamed — who makes his own radios and repairs his own go-kart — hoped to impress his teachers when he brought a homemade clock to MacArthur High on Monday. Instead, the school phoned police about Ahmed’s circuit-stuffed pencil case. So the 14-year-old missed the student council meeting and took a trip in handcuffs to juvenile detention. His clock now sits in an evidence room. Police say they may yet charge him with making a hoax bomb — though they acknowledge he told everyone who would listen that it’s a clock. In the meantime, Ahmed’s been suspended, his father is upset and the Council on American-Islamic Relations is once again eyeing claims of Islamophobia in Irving. https://www.youtube.com/watch?v=3mW4w0Y1OXE
|
# ? Sep 16, 2015 06:58 |
I'm sorry, but we only like white nerds.
|
|
# ? Sep 16, 2015 07:14 |
|
Welp, good point gently caress that guy then, for being smart and poo poo. What an rear end in a top hat.
|
# ? Sep 16, 2015 07:16 |
|
quote:He kept the clock inside his school bag in English class, but the teacher complained when the alarm beeped in the middle of a lesson dude's an idiot. don't bring your beeping clock to school, you doof.
|
# ? Sep 16, 2015 07:31 |
|
Who the hell would make a bomb that ticks or beeps or whatever anyway? That would be some mind games poo poo.
|
# ? Sep 16, 2015 08:06 |
|
BattleMaster posted:Who the hell would make a bomb that ticks or beeps or whatever anyway? That would be some mind games poo poo. Counterstrike bruh
|
# ? Sep 16, 2015 13:38 |
|
Blotto Skorzany posted:Counterstrike bruh Makes sense, half the PDs think they're seal team six ever since DHS threw them military surplus.
|
# ? Sep 16, 2015 18:40 |
|
I fully expect to get a "why would you even ask this question" response, but hey, google doesn't have all the answers, and that's how you learn... How come the Motorola 68000 never took off in the way that the ATmega328 did? It looks like the 68000 has (introduced?) SPI support. I2C obviously didn't exist in 1979. I guess the ATmega328 will oscillate on its own at 1 MHz. The 68000 probably needs an external oscillator, external RAM, etc? Also it's CISC, which I believe is quite a bit harder to write assembly for (that's my winter project...). Anyways, the 680X0 was ubiquitous in my childhood, so presumably it would be crazy cheap to produce today. I figured such a popular chip would have taken over instead of the Arduino. Is CISC just that old fashioned? I guess there are ARM M0 chips now that probably blow the 68000 out of the water...
|
# ? Sep 17, 2015 09:00 |
|
Freescale still produces 68K derivatives under their ColdFire, 683xx series, and DragonBall series MCUs and processors. Since they're still produced, I guess they're getting a decent amount of use. I think the main thing is competition from intel/arm and in earlier years the PPC architecture. I also think the currently offered MCUs are not quite as user/hobbyist friendly as things like Microchip/Atmel parts - you won't find them in DIP packages, and it's not straightforward to get support for them.
|
# ? Sep 17, 2015 15:06 |
|
The 68k derivatives ended up being popular in certain niche industries. The 683xx series is used in lots of comms/telecom gear.
|
# ? Sep 17, 2015 16:14 |
|
I saw some market research that a non-zero number of companies are planning to do a *new* design with a x186 in the next year. If there is a processor out there, someone will use it I guess.
|
# ? Sep 17, 2015 16:21 |
|
For the longest time, Motorola just did not care about the hobby market. Either you were big enough to buy 10k chips at a time, or you could go elsewhere. You could go through resellers, but that worldview extends into so many areas. For example, who needs good public-facing documentation when they're not interested in the public, and anyone they're selling to has a direct line to an application engineer. Atmel is as friendly to the hobbyist as they come. The ATmega328 owes much of its success to that, and has exploded because of the Arduino. Looking at that, you might wonder if any chip that had been selected for the first Arduino would have been the success we're talking about. But the AVR was selected because it already had a quality free hobbyist toolchain. It had a solid communications program and a great free compiler. The Arduino just bundled it up in a nice package and gave a user friendly utilities library. The traditional AVR rival, the PIC, didn't even have a free compiler. They were either paid, or crippled. Either way, they weren't open source and would have been much harder to bundle into something that was going to be redistributed.
|
# ? Sep 17, 2015 17:22 |
|
I don't really remember much about the 68k from school, but the Atmega8/168/328 only seems to have surpassed PICs in the hobbyist space once people started using them for Arduino derivatives (or at least people moving on from using Arduinos). The benefits I can think of off the top of my head:

1. Free compiler/dev environment from Atmel (which is pretty decent)...but also good GCC support with AVR Libc
2. Available in DIP packages
3. You only need the uC, a linear regulator and a programming header to get started
4. Cheap programmer (and now even cheaper unofficial programmers that are compatible with the open source toolchain)
5. Low power---you can run a dev board from a USB port
6. Atmegas are cheap as poo poo

One downside is that Atmel does not give away free samples, which is a big deal for broke college kids. For that reason, I'm kinda surprised that there isn't more stuff being done with TI's MSP430 line---unlike other companies, TI has always been good about giving samples of DIP-packaged parts, whereas others exclude them specifically because of hobbyists. I did a lot of stuff with Cypress PSoCs when I was in school because I was able to get free samples of breadboardable parts that did all kinds of cool stuff. I probably still have thousands of unused free samples of everything from motor controllers to 1 GHz ADCs to this day
|
# ? Sep 17, 2015 17:32 |
|
I ordered two samples of a PIC I hadn't used before from Microchip on a Saturday, and Monday morning at 8:30 the fedex guy was at my door with a package from Thailand, and now I'm on a sampling rampage too.
|
# ? Sep 17, 2015 21:15 |
|
Speaking of architectures that aren't as dead as you thought, Freescale still makes PPC chips for automotive and server applications. They've been steadily rolling out new cores for the past couple of years.
|
# ? Sep 17, 2015 22:36 |
|
Does anyone know anything about USB virtual com port drivers? Specifically, I'm trying to figure out how to keep my device from assigning to a different serial port for every physical port on my laptop. Is this a limitation of using the default windows usb serial driver? Or can I modify my .inf file to change this behavior? I'd like it to work more like the USB-Serial adapters I use, where (i think) the 1st device connected (regardless of which USB port) is mapped to the same ContainerID (and the same COM port), and then a 2nd connected device will be mapped to a second ContainerID (and its associated COM port). Would I need to make a custom USB driver for this capability?
|
# ? Sep 17, 2015 23:56 |
|
Slanderer posted:I don't really remember much about the 68k from school, but the Atmega8/168/328 only seems to have surpassed PICs in the hobbyist space once people started using them for Arduino derivatives (or at least people moving on from using Arduinos). The benefits I can think of off the top of my head: dougdrums posted:I ordered two samples of a PIC I hadn't used before from Microchip on a Saturday, and Monday morning at 8:30 the fedex guy was at my door with a package from Thailand, and now I'm on a sampling rampage too. What? How does one get in on this sample binge?
|
# ? Sep 18, 2015 00:03 |
|
Ask.
|
# ? Sep 18, 2015 00:11 |
|
xilni posted:What? How does one get in on this sample binge? https://www.microchip.com/samples/default.aspx I'm a student right now, so I just registered with an .edu address with microchip and ordered up. There's a limit of two different types per order, but it hasn't been a problem. When I had an actual business before, they'd take 1-2 weeks or something like it said when you ordered, that's why I was so surprised. I actually ordered some from digikey when I realized that I needed to get them sooner, but they still arrived beforehand. Using a PIC16F1454 in this one design saved me a ton of stuff, so I decided to get some of their high-end ones to see how they compare to others in that regard. And I also needed some gigabit ethernet phy's, and maybe some Li-Poly managers ... Slanderer posted:One downside is that Atmel does not give away free samples, which is a big deal for broke college kids. For that reason, I'm kinda surprised that there isn't more stuff being done with TI's MSP430 line---unlike other companies, TI has always been good about giving samples of DIP packaged parts, whereas others exclude them specifically because of hobbyists. I did a lot of stuff with Cypress PSOC's when I was in school because I was able to get free samples of breadboardable parts that did all kinds of cool stuff. I have TI's memory display breakout, but I hooked it up to one of those ESP8266's to try to make a hand thermal-powered wifi computer terminal and it still drew too much power . I think I might get me some MSP430 samples and try again though. dougdrums fucked around with this message at 00:55 on Sep 18, 2015 |
# ? Sep 18, 2015 00:49 |
Slanderer posted:Does anyone know anything about USB virtual com port drivers? Specifically, I'm trying to figure out how to keep my device from assigning to a different serial port for every physical port on my laptop. Is this a limitation of using the default windows usb serial driver? Or can I modify my .inf file to change this behavior? I'd like it to work more like the USB-Serial adapters I use, where (i think) the 1st device connected (regardless of which USB port) is mapped to the same ContainerID (and the same COM port), and then a 2nd connected device will be mapped to a second ContainerID (and its associated COM port). I don't know, but this drives me crazy, too.
|
|
# ? Sep 18, 2015 03:56 |
|
|
|
Slanderer posted:Does anyone know anything about USB virtual com port drivers? Specifically, I'm trying to figure out how to keep my device from assigning to a different serial port for every physical port on my laptop. Is this a limitation of using the default windows usb serial driver? Or can I modify my .inf file to change this behavior? I'd like it to work more like the USB-Serial adapters I use, where (i think) the 1st device connected (regardless of which USB port) is mapped to the same ContainerID (and the same COM port), and then a 2nd connected device will be mapped to a second ContainerID (and its associated COM port). Possibly http://blogs.msdn.com/b/oldnewthing/archive/2004/11/10/255047.aspx ?
|
# ? Sep 18, 2015 04:06 |