  • Locked thread
some kinda jackal
Feb 25, 2003

 
 

Zuph posted:

Mine came with a micro-USB cable.

Mine did too, I just don't like having a ton of different cables lying around. All my other dev boards use mini usb (aside from the Arduino obviously) and I like just having one cable I need plugged in for development :(

Of course once my Bus Blaster JTAG thing shows up this'll be a moot point :)


peepsalot
Apr 24, 2007

        PEEP THIS...
           BITCH!

Martytoof posted:

Mine did too, I just don't like having a ton of different cables lying around. All my other dev boards use mini usb (aside from the Arduino obviously) and I like just having one cable I need plugged in for development :(

Of course once my Bus Blaster JTAG thing shows up this'll be a moot point :)

I've got the opposite problem. All my phones for the past few years have been micro USB so I've collected a bunch of those cables, and the mini USB cables are the odd ones I can never find. So I got a bunch of mini-to-micro adapters to plug on the ends of all my micro cables. Saves having an extra cable in my geek bag when traveling, etc.

Blotto Skorzany
Nov 7, 2008

He's a PSoC, loose and runnin'
came the whisper from each lip
And he's here to do some business with
the bad ADC on his chip
bad ADC on his chiiiiip

movax posted:

Totally missed it, but it does look pretty slick, and I think it'll be fine for hobbyist/etc use. The configurable blocks are pretty slick; need a ton of PWM for motor control? No problem! gently caress ton of SPI/I2C for weird reasons? Still good!

Paging Otto to this thread to rant about how the PSoC has screwed him over repeatedly (though I think he was having problems with a specific version).



I and my coworkers have certainly had plenty of issues with the PSoC 5.

The worst of these have been on the analog side of things. There are "unpleasant" stability issues mostly stemming from the Delta-Sigma ADC component misconfiguring its charge pump clock, and unrepeatability issues related to a whole host of clock interactions unforeseen by their design team and an input buffer with numerous nasty issues including what looks like a race hazard (if you can tolerate bypassing the buffer for your application, do so; we couldn't). While some of these issues exist on the PSoC3, they are roughly 4x smaller in magnitude based on the measurements I and a couple of coworkers and some engineers at Cypress have done. This is a recurring theme in PSoC land: the PSoC 3 has some warts but is generally a working and finished product, whereas the PSoC 5 is engineering silicon that you get charged money for.

On the digital side, things are generally much better but there are still plenty of annoyances. The UDBs are great (although confirm the maximum voltage and current and slew rate ratings on inputs and outputs thoroughly by empirical means before you believe that datasheet!) and the DMA capabilities are fantastic, but many of the stock components are poorly coded, generally assuming that
  1. nothing has changed at run-time from design- or compile-time settings (many components, eg the SD card component, break if you change your master clock to save power and provide no way to tell them you've changed clock speed or provide an external clock that doesn't change) and
  2. no error conditions will ever occur, and if they do deadlock is an acceptable way to handle them (the code for many components is riddled with infinite loops, eg. the I2C component will get stuck forever if it misses an ACK, there is no way to set a timeout, and the normal hooks that most components provide in their source are not present so you can't fix it yourself without rewriting the whole thing)
On the other hand, this has given me a great education in how to implement a watchdog :rimshot:

There are of course some weird quirks and bugs with PSoC Creator itself that are mostly in line with what you see in freebie embedded environments - the rubber banding of components gets confused and shits itself more often than not; if you delete or rename a component it leaves the generated source with the old name hanging around in the generated source directory but removes it from the project (this leads to fun errors - once I had an input to an AMux called 'Battery'. I changed the name to Vbb. I later created some routines to calculate percent charge remaining based on a calibrated discharge curve and put the code in src/routines/battery.c and the constants in include/routines/battery.h. battery.c wouldn't compile for the longest time, saying that none of the constants defined in battery.h were defined. This was because somewhere in the generated source directory, there still lingered a Battery.h from when Vbb had been called Battery, which was getting included instead); every time I start up, my TopDesign.cysch seems to be at a different zoom level; the .cyprj file generates merge conflicts incessantly and generally is a good example of how not to implement an XML-based file format; phantom errors are sometimes generated on builds that disappear when you click their item in the notice list; the debugger gets easily confused, can't step over a breakpoint or into a function reliably, and even loses track of the program counter(!) (this was all fine in Creator 2.0 but broke in 2.1); etc etc.


There are also a bunch of acknowledged errata and limitations of the PSoC 5 that are supposedly getting fixed in the upcoming LP version of the silicon. I'm not holding my breath, and after chasing ADC issues for a couple of months and farming out half the crap we had wanted the PSoC to do back to dedicated external chips due either to limitations or errata (eg. USB is now handled by an FTDI chip in our design), we're not going to bother with other products of theirs where I work. Sorry to the two apps engineers who were immensely helpful and tirelessly fought to have problems fixed before being told to work on other things because we weren't a high enough volume customer to be worth the time.




All this being said, I wouldn't worry at all about using a PSoC for a hobby design.

At home you don't give enough of a poo poo about power to downclock dynamically, so a bunch of component issues go away.

At home you can turn the thing on and off if something gets stuck in a loop so you don't care if a component deadlocks because some jackass in Bangalore thought while(1) was the greatest thing since sliced bread.

At home you don't need 17 bits of noise-free resolution from a 20 bit ADC with perfect linearity (after correction) and no offset (after correction) and no hysteresis and no tempco (after correction) and by the way could you give me ten updates a second after filtering, oh wait I meant thirty how silly of me!


and so on.

some kinda jackal
Feb 25, 2003

 
 
Completely philosophical question:

Why would I ever set my peripheral clock to run at less than peak clock speed?

I'm looking at the clock tree for this Cortex processor and I have my PLL driving my SYSCLK at 72MHz. I then select a prescaler to drive PCLK1 and PCLK2. PCLK1 appears to be documented to run at a maximum of 36MHz so I would need to select a prescaler of /2 for that particular clock, but I see that I have the option of driving it anywhere from /1 (which would be invalid, I assume) to /16.

So I guess I'm left wondering why I would ever drive the clock that low? Is it just something that is possible but would never really be done in the real world, or is it a "balance power draw and necessary speed" sort of situation?

edit: I just saw that I can pre-prescale the entire peripheral bus by /512 to begin with :q:

peepsalot
Apr 24, 2007

        PEEP THIS...
           BITCH!

Somethin weird goin on, where my User CP is showing that there is one new post in this thread by Martytoof, but no matter how many times I click the thread it never shows the new post. Maybe this post will fix it?

e: Yeah it made it show up, weird.

Victor
Jun 18, 2004

Martytoof posted:

is it a "balance power draw and necessary speed" sort of situation?
Power might be a huge consideration. A lot of power is used to switch a transistor, vs. just let it stay at whatever level it's at. If you're running on a battery, or just want to be green, you care about this stuff. But having to care about this stuff is annoying, so most people don't unless they're forced to.

Poopernickel
Oct 28, 2005

electricity bad
Fun Shoe
I'm working on a system that will use RS-485 to poll various sensor data from 25 different devices. Currently, I'm planning the interface around a 115.2k serial port.

My question is: How should I manage the data coming back?

I'm intending to write a GUI with a few running plots that can be set to view any of the data in real-time (I'm planning for 4 updates per second on the GUI, and 8 updates per second coming back from each sensor). I've been thinking about how to structure the management application. Ideally, I'd like to decouple the GUI from the actual data collector (which would be some kind of Python app that sent queries out the serial port and logged the responses).

But if I went this route, what would be the best way to get that data from my data collector back into the GUI? What if I also wanted it to log the data to disk (either by using a database, or by flushing once per minute or so and storing the last 24 hours worth)?

Is a SQLite database worth using for this kind of thing? If so, do I need to worry about having one program (the data collector) writing to it and deleting old records while the other program (the GUI) is trying to access it? Would I even be able to get reasonable performance out of it? I don't have a good feel for whether 100 SQL writes and maybe 500-600 SQL reads per second is totally impractical. Thoughts?

armorer
Aug 6, 2012

I like metal.
Without knowing what kind of hardware the python program would be running on, it is tough to say for sure. That is higher throughput than I would expect to be able to get from SQLite (or mysql) though. You are looking at a lot of overhead if you try to write each measurement individually in its own transaction. You might be able to batch write those 100 records each second, or structure the table so that a row corresponds to a second of data or something rather than a single measurement. (Also, wouldn't it be 200 writes per second? 25*8?)

I am curious why you expect to have so many reads? I honestly expect that you won't even get close to those performance numbers if you are running this on a standard issue desktop computer. I would suggest writing up a little python app that just loops and writes fake (but representative) data to your intended database target to see how the machine performs.
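A minimal sketch of the kind of throughput test described above, using Python's built-in sqlite3 module; the table layout, column names, and batch sizes are invented for illustration:

```python
import random
import sqlite3
import time

def bench_writes(db_path=":memory:", seconds=1.0, batch=1):
    """Insert fake sensor readings as fast as possible; return rows/sec."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS readings "
                 "(ts REAL, sensor INTEGER, volts REAL, amps REAL)")
    rows = 0
    deadline = time.time() + seconds
    while time.time() < deadline:
        fake = [(time.time(), rows % 25, random.random(), random.random())
                for _ in range(batch)]
        with conn:  # one transaction per batch
            conn.executemany("INSERT INTO readings VALUES (?, ?, ?, ?)", fake)
        rows += batch
    conn.close()
    return rows / seconds
```

Comparing `batch=1` against `batch=200` on a file-backed database (rather than `:memory:`) shows the per-transaction overhead mentioned above.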

McGlockenshire
Dec 16, 2005

GOLLOCKS!

Poopernickel posted:

Is a SQLite database worth using for this kind of thing? If so, do I need to worry about having one program (the data collector) writing to it and deleting old records while the other program (the GUI) is trying to access it?

Activating the journal_mode=WAL pragma to use the 3.7.0+ write-ahead log can provide a huge boost to sanity when you have concurrent readers and writers. Just be sure to understand the disadvantages listed. I'm not sure it'd be fast enough here, but it could work.
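For reference, turning the pragma on from Python might look like this (file name invented; requires SQLite 3.7.0 or newer):

```python
import sqlite3

def open_wal(path):
    """Open a SQLite database in write-ahead-log mode so one process can
    write (the collector) while another reads (the GUI) without the
    readers blocking the writer."""
    conn = sqlite3.connect(path)
    mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]
    if mode != "wal":  # e.g. an in-memory database can't do WAL
        raise RuntimeError("WAL mode unavailable: got %r" % mode)
    return conn
```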

McGlockenshire fucked around with this message at 18:35 on Dec 20, 2012

Phobeste
Apr 9, 2006

never, like, count out Touchdown Tom, man

Martytoof posted:

Completely philosophical question:

Why would I ever set my peripheral clock to run at less than peak clock speed?

I'm looking at the clock tree for this Cortex processor and I have my PLL driving my SYSCLK at 72MHz. I then select a prescaler to drive PCLK1 and PCLK2. PCLK1 appears to be documented to run at a maximum of 36MHz so I would need to select a prescaler of /2 for that particular clock, but I see that I have the option of driving it anywhere from /1 (which would be invalid, I assume) to /16.

So I guess I'm left wondering why I would ever drive the clock that low? Is it just something that is possible but would never really be done in the real world, or is it a "balance power draw and necessary speed" sort of situation?

edit: I just saw that I can pre-prescale the entire peripheral bus by /512 to begin with :q:

Entirely power draw. That, and if you really need some other component to operate slowly, you might need to pre-divide.
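The tradeoff is easy to see numerically. Using the figures from the question above (72MHz SYSCLK, 36MHz PCLK1 ceiling; the prescaler set here is assumed, check the part's reference manual for the real one):

```python
SYSCLK_HZ = 72_000_000     # PLL output from the example above
PCLK1_MAX_HZ = 36_000_000  # documented ceiling for this bus

# Every prescaler that yields a legal PCLK1, mapped to the resulting rate
valid = {div: SYSCLK_HZ // div
         for div in (1, 2, 4, 8, 16)
         if SYSCLK_HZ // div <= PCLK1_MAX_HZ}
# /1 (72MHz) is excluded as out of spec; /2 gives the full 36MHz, while
# /16 gives 4.5MHz for when the power budget matters more than speed.
```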

Also, for all you MSP430-users out there: check out GRACE, the graphical peripheral setup tool included in CodeComposer. For anything where you only have to set things up once and then write other code, GRACE is great. Easy tutorials on how to get poo poo set up right for common use cases, it's all good.

Also all buggy, but what can you do.

Poopernickel
Oct 28, 2005

electricity bad
Fun Shoe

armorer posted:

Without knowing what kind of hardware the python program would be running on, it is tough to say for sure. That is higher throughput than I would expect to be able to get from SQLite (or mysql) though. You are looking at a lot of overhead if you try to write each measurement individually in its own transaction. You might be able to batch write those 100 records each second, or structure the table so that a row corresponds to a second of data or something rather than a single measurement. (Also, wouldn't it be 200 writes per second? 25*8?)

I am curious why you expect to have so many reads? I honestly expect that you won't even get close to those performance numbers if you are running this on a standard issue desktop computer. I would suggest writing up a little python app that just loops and writes fake (but representative) data to your intended database target to see how the machine performs.

Basically, the project I'm working on is a big stack of DC-DC converters that step 375VDC down to various user-selectable voltages. We're planning to package 12 of them for a particular niche application, with an RS-485 uplink back to the base station.

The input and output of each converter have a little opto-isolated daughter board with a microcontroller that takes measurements of the voltage, current, and leakage current. The input-side board also sends some control signals to the DC-DC converters. I'd like to be able to monitor all of the various signals simultaneously and record the data. There's no strict requirement on how often I measure them, but I figure I can get 8-10 samples per second on all measurements in the system if I keep the RS-485 bus saturated at 115.2kbps.

I've done plenty of microcontroller firmware in the past, so that's no big deal. However, this time around I'm also in charge of the PC-side host application. And I'm trying to get my head around exactly how I should deal with all of the data (how I should record it, where the serial port interface should live, etc).

On the host PC, it's not too hard to write an application to poll the RS-485 bus fast enough. But I'm scratching my head at how I should structure the rest of the system. I want to be able to log all of the data as it comes in, and then also display some of the data on a host GUI (varied, depending on which screen is up).

My initial thought was that I could do it via SQLite, with my serial application polling data and writing to a database that the GUI could read from at its leisure. But before I go and code it all up, is this approach the 'right' one to take? Would you recommend a different topology?

armorer
Aug 6, 2012

I like metal.

Poopernickel posted:

...

On the host PC, it's not too hard to write an application to poll the RS-485 bus fast enough. But I'm scratching my head at how I should structure the rest of the system. I want to be able to log all of the data as it comes in, and then also display some of the data on a host GUI (varied, depending on which screen is up).

My initial thought was that I could do it via SQLite, with my serial application polling data and writing to a database that the GUI could read from at its leisure. But before I go and code it all up, is this approach the 'right' one to take? Would you recommend a different topology?

Most of my professional experience is on the PC software side, and I do the electronics / embedded software stuff as a hobby. I have worked on large backend data processing systems in the past, and I would not expect to get transactional throughput like that without specialized hardware and/or some serious performance tuning. That said, I have been dealing with much larger records than what it sounds like you will be writing. It sounds like you are mostly concerned with small numeric vectors, so it may work out.

From what you have said, I definitely think it is a good idea to decouple the polling component from the GUI. I would probably design the system such that one process was (or several were, if necessary) responsible for polling the serial data and writing that out. The big question is whether you can write it out to a database fast enough to keep up. Assuming that you can, then you could obviously write a separate GUI application which reads from the database periodically and refreshes the screen. At that point you could also have that GUI client be a standalone app which you could run on several PCs at once if necessary (assuming that the backend database could support whatever load that caused). You could even get fancy and build features into the GUI that let the user scroll around through the data, zoom in, dynamically adjust the axes, print, and all that good stuff.

Do you have a requirement that the data be available long term? Or can you discard data that is older than some timeframe? (Your table will become very long and skinny over time if you can't clear out old data.) Is it OK if there is some lag time between data acquisition and display? (ie: can the GUI lag a few seconds behind the measurement, or does it need to show what you are reading *now*?) Is it ok if you lose some measurements (ie: does it even have to be transacted, and/or can you skip some readings if you lag too much in order to catch up)?

Assuming that each of the three measurements you mention can be captured in 32 bits, and that you add a 32-bit timestamp, you have 16 bytes of data you are collecting 8 times / second from 25 sensors, or 3200 bytes ≈ 3.2KB/sec. If this really is representative of the data you plan to write, I would suggest trying it out with a small mysql (or SQLite, or postgres) database and a simple loop that writes random floats to see how it performs. This will give you a much better idea of what you can expect to get out of your hardware than my back-of-the-envelope speculation. Keep in mind that on the real system, some CPU will also be taken up polling for data. I don't really know off hand how much that will tax the system.

I would say that a reasonable first pass would be to try batch writing the measurements in one second chunks (batches of 200 measurements from the 8*25 metric). That will cut down on your transaction overhead. I have done this sort of thing before using an in memory queue where the measurements are written into a queue and the consumer reads them out into batches. The queue consumer just reads stuff off the queue and remembers it until it hits the batch limit, at which point it writes it all out. You would basically have a multi-threaded process with a few threads responsible for the serial polling and one responsible for the database writing. The GUI would be an entirely separate app, most likely with a read only login for the database.
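A rough sketch of that queue-consumer pattern; `write_batch` stands in for whatever actually does the batched INSERT, and all of the names here are placeholders:

```python
import queue
import threading

def batch_consumer(q, write_batch, stop, batch_size=200):
    """Drain measurements from q and pass them to write_batch() in chunks.

    q           - queue.Queue fed by the serial-polling thread(s)
    write_batch - callable taking a list of rows (e.g. one executemany)
    stop        - threading.Event; set it to drain the queue and shut down
    """
    pending = []
    while not (stop.is_set() and q.empty()):
        try:
            pending.append(q.get(timeout=0.05))
        except queue.Empty:
            continue
        if len(pending) >= batch_size:
            write_batch(pending)
            pending = []
    if pending:  # flush whatever is left on shutdown
        write_batch(pending)
```

The polling threads just `q.put(row)` and never touch the database; the consumer runs in its own thread and owns the connection.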

some kinda jackal
Feb 25, 2003

 
 
If you're not really fussed about impeccable accuracy in your timers and such, is there any reason not to just use the internal RC oscillator instead of an external crystal driving a PLL? If so, is there a rule of thumb for how the internal oscillator will fluctuate over time? I can't find anything like that in the datasheet for the NXP chip I'm using. If my application is using a timer counting seconds, are we talking a fluctuation of a few seconds over a minute, or microseconds over an hour, or what? Just curious about a ballpark.

Obviously I mean building a board from scratch, not disabling the xtal on an existing dev board ;)

some kinda jackal fucked around with this message at 05:29 on Dec 21, 2012

Base Emitter
Apr 1, 2012

?
I got curious, so I took a quick look at a datasheet randomly picked off the NXP site (LPC1300), which says the internal clock is accurate to 1% over the full supply voltage and temperature range of the part (10.4 in http://www.nxp.com/documents/data_sheet/LPC1311_13_42_43.pdf). So, a clock could drift up to 1 second in 100 seconds.
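A quick sanity check on what a 1% tolerance means for timekeeping (the numbers just follow from the datasheet figure above):

```python
tolerance = 0.01                          # +/-1% from the LPC1300 datasheet
worst_per_minute = tolerance * 60         # seconds of drift per minute
worst_per_day = tolerance * 24 * 60 * 60  # seconds of drift per day
# ~0.6 s/min and ~864 s (about 14 minutes!) per day worst case - fine for
# a UART baud clock, hopeless for anything resembling a wall clock.
```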

some kinda jackal
Feb 25, 2003

 
 
Oh awesome, thanks for that. I'm using the LPC1768 but I assume it's pretty similar. I must have missed the accuracy portion in the 800+ page manual :smith:

e: Duh, or maybe I should actually look in the DATA SHEET and not the MANUAL :downs:

movax
Aug 30, 2008

The internal (F)RC is good for most things; most micros will demand an XTAL for the precision required of things like USB though. Even UARTs might drift with the RC oscillator, but sometimes both ends of the link can tolerate the baud-rate drifting a little bit.

I feel like I wrote about this in this very thread or in the DIY Electronics thread. So late. :(

Poopernickel
Oct 28, 2005

electricity bad
Fun Shoe

armorer posted:

- interesting responses -

Mocking up a little application is a great idea - especially because this is sounding more and more feasible as I dig into it. I could probably do my database operations in chunks as you describe.

I don't need to log forever, so I was planning for the database to have 4 hours worth of data, and then every four hours it could be copied into a folder that always keeps the 24 hours' worth of logs.

So here's the next question - I'm going back and forth right now in my head about how to get info into the GUI. I'm torn between designing the GUI to read from my database (which has some lag issues: when the user turns a power supply on or off, he'll have to wait to see it happen) versus an approach where the logging application spams all of its data to STDOUT, and is launched with STDOUT connected to a pipe to the GUI app.

What do you think about that approach for the GUI accepting info? Is STDOUT fast enough to be used like that? Would the approach work if somebody wanted to port the application to Windows?

Blotto Skorzany
Nov 7, 2008

He's a PSoC, loose and runnin'
came the whisper from each lip
And he's here to do some business with
the bad ADC on his chip
bad ADC on his chiiiiip
Why not use a named FIFO or a socket?
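For instance, the collector could push newline-delimited readings over a local socket and the GUI could read them directly, keeping the database out of the hot path. A toy sketch (socket.socketpair stands in for a real listening socket, and the line format is made up):

```python
import socket

# One end for the collector process, one for the GUI
collector_end, gui_end = socket.socketpair()

def push(sock, line):
    """Collector side: send one reading, newline-terminated."""
    sock.sendall((line + "\n").encode())

# The GUI side would sit in a loop recv()ing and splitting on newlines
push(collector_end, "sensor=3 volts=4.98 amps=0.12")
data = gui_end.recv(4096).decode()
```

Unlike a POSIX named FIFO, sockets also work on Windows, which answers the portability worry about pipes.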

some kinda jackal
Feb 25, 2003

 
 
JTAG question:

If I want to use a 5 pin JTAG connection to program/debug a Cortex MCU on a dev board, should I be powering the chip from the JTAG adapter (it provides 3v3) or should I be powering the chip from its own power supply, or does it not matter as long as the chip meets minimum running voltage requirements?

I'm going to be wiring individual cables from my Bus Blaster JTAG card to my Cortex dev boards. I suppose I can either hook up the 3v3 line from the Bus Blaster and hook it into Vdd, or I can not hook up the 3v3 line and power the MCU boards over USB like I normally would.

I'm scouring datasheets for information so I avoid blowing my BB or MCU but so far programming info seems vague at best.



(I'm specifically not using SWD because the Bus Blaster needs to be re-flashed to support SWD and I don't really want to complicate things right now)

Krenzo
Nov 10, 2004
You power the chip like normal from its own supply.

Blotto Skorzany
Nov 7, 2008

He's a PSoC, loose and runnin'
came the whisper from each lip
And he's here to do some business with
the bad ADC on his chip
bad ADC on his chiiiiip

Martytoof posted:

JTAG question:

If I want to use a 5 pin JTAG connection to program/debug a Cortex MCU on a dev board, should I be powering the chip from the JTAG adapter (it provides 3v3) or should I be powering the chip from its own power supply, or does it not matter as long as the chip meets minimum running voltage requirements?

I'm going to be wiring individual cables from my Bus Blaster JTAG card to my Cortex dev boards. I suppose I can either hook up the 3v3 line from the Bus Blaster and hook it into Vdd, or I can not hook up the 3v3 line and power the MCU boards over USB like I normally would.

I'm scouring datasheets for information so I avoid blowing my BB or MCU but so far programming info seems vague at best.



(I'm specifically not using SWD because the Bus Blaster needs to be re-flashed to support SWD and I don't really want to complicate things right now)

You can do either, but just powering the chip from its own supply is probably the easiest and most foolproof thing. NB: you won't be able to do profiling without the ETM pins, and you may need a 20-pin connector to accommodate them instead of a 10-pin (but you may not care about profiling).

armorer
Aug 6, 2012

I like metal.

Poopernickel posted:

Mocking up a little application is a great idea - especially because this is sounding more and more feasible as I dig into it. I could probably do my database operations in chunks as you describe.

I don't need to log forever, so I was planning for the database to have 4 hours worth of data, and then every four hours it could be copied into a folder that always keeps the 24 hours' worth of logs.

So here's the next question - I'm going back and forth right now in my head about how to get info into the GUI. I'm torn between designing the GUI to read from my database (which has some lag issues: when the user turns a power supply on or off, he'll have to wait to see it happen) versus an approach where the logging application spams all of its data to STDOUT, and is launched with STDOUT connected to a pipe to the GUI app.

What do you think about that approach for the GUI accepting info? Is STDOUT fast enough to be used like that? Would the approach work if somebody wanted to port the application to Windows?

It might work, although it will almost certainly be more portable if you use the database as a shared repository. Using STDOUT like that is a bit like drinking from the firehose as well. If the GUI is supposed to just follow the latest data then it is probably fine, but if you want the user to be able to scroll around, zoom, change the axes and stuff, it might be difficult if data is still being appended to the graph constantly.

Also, if you want to support multiple GUIs for the same device, you will want the data in a database.

Thinking about this more though, another option that might give you the best of both worlds is something like this:

Have the GUI directly update from the data reading component, without any database in between. After updating the GUI, then drop the data onto a queue like I mentioned and let it be written to the database asynchronously. That way your display is updated as quickly as possible, but the data is still stored for logging purposes. Then you could have a mode in the GUI (maybe added later) where it opens up a historic dataset and lets the user scroll around in it. That mode would only read from the database though, and wouldn't have to worry about keeping up with the device in real time. If you take this approach, your database writes can be batched as large as you need and it doesn't really matter.

One downside to this approach is that if you kill the process it might not have finished logging everything that the user has seen in the GUI.

some kinda jackal
Feb 25, 2003

 
 
Thanks guys.

At this stage I'm not too fussed about profiling. I just picked up this Bus Blaster so I could standardize my development environment for all my boards, instead of using LPCXpresso for one and Code Composer Studio for the other, etc :)

some kinda jackal
Feb 25, 2003

 
 
Can someone help me understand microcontroller memory maps? I'm getting kind of thrown off by the high values of memory listed. Like for example:

http://support.code-red-tech.com/CodeRedWiki/CM3_MemoryMap

Like I totally don't understand what the megabytes refer to. I'm completely thrown because I understand what memory is, and I think I understand memory mapping, but apparently not because I have no idea where large values like 511MB come from. Does this refer to some physical amount of memory or is it completely logical? I mean I think I already know the answer because there's no physical amount of memory like that on chip, but I'm having trouble connecting A to B.

edit: That is to say, I think I understand memory mapping to a point, and I understand how it's used to access peripherals and such, but there's so much more to it that I don't understand.

Base Emitter
Apr 1, 2012

?
It's just a map of what address ranges are reserved for what uses across all types of ARM Cortex processors by the ARM standard, regardless of vendor. Any given chip can/will have less than the reserved amount of memory for any given range; it's just a reservation. A 32-bit ARM has 32-bit addresses and can address up to 4GB in theory, so it's divided up this way.

If you have a chip that has 64KB of RAM it'll probably be between 0x20000000 and 0x2000FFFF, other larger chips will populate more of the range reserved for RAM, but will never put a peripheral in that range.

The upper half a gigabyte looks like it's for vendor-specific hackery, with the first 1MB of that half a gig reserved for specific things.

some kinda jackal
Feb 25, 2003

 
 
Ohh so it's potential. Not "there is 4gb of poo poo addressed on this chip", but there can be.

movax
Aug 30, 2008

With 4GB of address space, you can do things like bit-banding, and give individual bits of SFRs their own address in memory, making read/write operations really fast and efficient. The higher-end ARM MCUs usually have bit-banding.
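The alias arithmetic is simple; here is a Python rendering of the Cortex-M3 peripheral bit-band mapping (region bases per the ARM documentation):

```python
def bitband_alias(byte_addr, bit, base=0x40000000, alias_base=0x42000000):
    """Return the bit-band alias word address for one bit of a register.

    Cortex-M3 maps each bit in the 1MB region at `base` to its own 32-bit
    word in the alias region: alias = alias_base + byte_offset*32 + bit*4.
    A word write to that alias address atomically sets or clears the bit.
    """
    return alias_base + (byte_addr - base) * 32 + bit * 4
```

e.g. bit 1 of a register at 0x40000004 lands at 0x42000084. The SRAM region at 0x20000000 has an equivalent alias region at 0x22000000.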

Blotto Skorzany
Nov 7, 2008

He's a PSoC, loose and runnin'
came the whisper from each lip
And he's here to do some business with
the bad ADC on his chip
bad ADC on his chiiiiip
Anyone done SIL systems before? Is it as painful as it looks?

KetTarma
Jul 25, 2003

Suffer not the lobbyist to live.
I'm trying to program an ATmega328 with an AVR ISP2. I am absolutely certain that I have all of the pins correctly connected. I have 9VDC coming from a homemade rectifier that's tied into the mains and goes through a 5VDC linear regulator. This 5VDC is hooked into the VCC pin on the ATmega. The ground connection is hooked up to one of the ground connections. I can confirm that the ATmega is getting proper voltage with an LED in parallel with the VCC/GND pins to indicate power.

What is causing AVR Studio 6 to say that I am reading 0.0VDC? Everything is connected via breadboard. I swapped in another processor and it gave me the same problem. I've programmed processors with this MCU in the past so I know it works. Further confusing me is that the status lights on the MCU are both solid green indicating that it is functioning correctly. All firmware is up to date.


help me make robot go :(

movax
Aug 30, 2008

KetTarma posted:

I'm trying to program an ATmega328 with an AVR ISP2. I am absolutely certain that I have all of the pins correctly connected. I have 9VDC coming from a homemade rectifier that's tied into the mains and goes through a 5VDC linear regulator. This 5VDC is hooked into the VCC pin on the ATmega. The ground connection is hooked up to one of the ground connections. I can confirm that the ATmega is getting proper voltage with an LED in parallel with the VCC/GND pins to indicate power.

What is causing AVR Studio 6 to say that I am reading 0.0VDC? Everything is connected via breadboard. I swapped in another processor and it gave me the same problem. I've programmed processors with this MCU in the past so I know it works. Further confusing me is that the status lights on the MCU are both solid green indicating that it is functioning correctly. All firmware is up to date.


help me make robot go :(

If it is anything like PIC/other programmers, part of the ICSP/ISP header pinout should be tied to VCC as well for target circuit voltage sensing, right? Is that hooked up properly?

KetTarma
Jul 25, 2003

Suffer not the lobbyist to live.
Yeah. I built a transformer/rectifier/filter board that takes 120VAC and outputs 10VDC. I put it through a linear regulator on my breadboard to get a very steady 5.00VDC. I wire that in to the +VCC, +AVCC, +AREF pins on the ATmega. The ISP2 +VCC connector is wired in parallel with +VCC. The linear regulator GND is wired in with the GND on the ATmega and the GND on the ISP2.

I've also tried this same setup with various capacitors in parallel with the +AVCC pin and GND because someone told me that it would serve as a low pass filter which would help the analog to digital converter circuit in the processor. I don't quite understand how that would work but hey, I'm still a freshman EE. :shrug:

Captain Capacitor
Jan 21, 2008

The code you say?
I came up with a project idea a few months ago and just sorta threw myself into embedded development. I'm pretty proud of myself to have gone from "Blink" to a custom LUFA device in about 3 weeks. Even taught myself how to solder (thanks to SparkFun's tutorials). Still have a lot to learn, I know.



Sadly there's only one parts store in my city and its selection is lacking.

EpicCodeMonkey
Feb 19, 2011

Captain Capacitor posted:

I came up with a project idea a few months ago and just sorta threw myself into embedded development. I'm pretty proud of myself to have gone from "Blink" to a custom LUFA device in about 3 weeks. Even taught myself how to solder (thanks to SparkFun's tutorials). Still have a lot to learn, I know.



Sadly there's only one parts store in my city and its selection is lacking.

Neat! I'm the author of LUFA. I'm damned impressed you managed to go from the Arduino environment (hand-holding GUI with C++ libraries) to LUFA (makefiles and notepad, yay) in such a short amount of time. Congrats!

jiggerypokery
Feb 1, 2012

...But I could hardly wait six months with a red hot jape like that under me belt.

Embedded programming seems a pretty hot topic for jobs in my area at the moment. Has anyone got any tips on getting some demonstrable skills in it for someone with a C++ background? In other words the kind of stuff that I should probably google myself but deserves its own "how to get into it" section in the OP?

Arcsech
Aug 5, 2008

jiggerypokery posted:

Embedded programming seems a pretty hot topic for jobs in my area at the moment. Has anyone got any tips on getting some demonstrable skills in it for someone with a C++ background? In other words the kind of stuff that I should probably google myself but deserves its own "how to get into it" section in the OP?

Buy an MSP430 Launchpad or a Stellaris Launchpad or something like that and mess around with it for a while. DO NOT use an Arduino if you're attempting to get an embedded systems job, you will be laughed at all the way to the door. Right or wrong, the Arduino has a rep for being nothing but a kid's toy, but the MSP430 line are some pretty commonly used microcontrollers, and of course ARM is beefy as hell (for a uC).

Captain Capacitor
Jan 21, 2008

The code you say?

EpicCodeMonkey posted:

Neat! I'm the author of LUFA. I'm damned impressed you managed to go from the Arduino environment (hand-holding GUI with C++ libraries) to LUFA (makefiles and notepad, yay) in such a short amount of time. Congrats!

Well it was thanks to you, then! You were kind enough to help me out just last week when my eyes mistook F_CPU for F_USB.

I was also pleased to find that there's an implementation of Google's protocol buffers available for embedded devices. If my project runs out of room it'll be the first thing to go but the flexible data structures are kind of nice.

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.

Captain Capacitor posted:

Well it was thanks to you, then! You were kind enough to help me out just last week when my eyes mistook F_CPU for F_USB.

I was also pleased to find that there's an implementation of Google's protocol buffers available for embedded devices. If my project runs out of room it'll be the first thing to go but the flexible data structures are kind of nice.

I didn't know they did. You have made me work on my UAV for the first time in weeks because comms code was sucking the life out of me.

KetTarma
Jul 25, 2003

Suffer not the lobbyist to live.

Arcsech posted:

Buy an MSP430 Launchpad or a Stellaris Launchpad or something like that and mess around with it for a while. DO NOT use an Arduino if you're attempting to get an embedded systems job, you will be laughed at all the way to the door. Right or wrong, the Arduino has a rep for being nothing but a kid's toy, but the MSP430 line are some pretty commonly used microcontrollers, and of course ARM is beefy as hell (for a uC).

Any other tips? I've been faithfully plugging away at Arduino projects lately :( I'm an EE student who plans on minoring in CS. I want to do something related to either robotics or... well, whatever hires me.

Delta-Wye
Sep 29, 2005
Rebuild your Arduino projects using a bare atmega chip. You know it can handle the task, so see how much more efficient you can make it. Faster, more robust, more feature-filled, more power efficient, whatever you'd like. There are plenty of reasons to use an Arduino (it certainly makes rapid prototyping easier if you just want to try something out) but "I don't know any other way" is a pretty bad reason to do anything. If you've got the time, rebuilding without the bootloader or Arduino libraries will teach you a lot.


some kinda jackal
Feb 25, 2003

 
 
Pick up any of those "PIC projects" or "AVR projects" books. The concepts and things should mostly be interchangeable if you're using C and not using some high-level library. You'll probably need to figure out what registers you need to poke to do the same thing, but a little bit of learning is good.

I did all these AVR tutorials on an ARM board and it was a great learning experience: http://newbiehack.com/MicrocontrollerTutorial.aspx
