movax
Aug 30, 2008

OctaviusBeaver posted:

I'm trying to drive a small DC motor from an arduino using an H bridge. Do I need to size the H bridge to handle the stall current of the DC motor or just the normal operating current plus a safety margin (but still not as high as the stall current)?

Personally, I'd size for at least 1.5x the normal operating current. Do you have any idea what in-rush current/transients can spike to? Safety margin(s) are always a good idea. If you have a chip rated for 1A, you have no idea if that chip barely passed QA or exceeded requirements with flying colours.

Also, I am offended that PIC doesn't even get a picture in the OP. PICs own, the devtools are great, and the programmers are reliable as all hell. I love PIC24s/dsPICs; PIC32 isn't too shabby either.

e: you could also separate a lot of the boards/CPUs in the OP by whether they have an MMU or not, or whether they're Linux-capable (ignoring non-MMU uClinux).

movax
Aug 30, 2008

Zombietoof posted:

Thanks for all the help so far everyone.

For some reason my system is having some serious issues compiling CS Lite toolchain so I'm going to move on to one of two things.

I found YAGARTO which looks like a working toolchain with OSX support. It hasn't been tested with anything above 10.6, however, so if that doesn't work I'm going to claim defeat with my Mac for the time being and use VMware Fusion to set up a reliable IDE under a Linux of some flavour. It seems there's more information available.

I should probably just stop torturing myself and install a Windows VM to use the tools Freescale provides, but for some reason I'm just not looking forward to doing this under Windows. I may be digging myself into a hole here, and not because I dislike Windows or anything, just that I'd like to get this working on something else before that.

Honestly, Windows is probably the OS that will make you waste the least time just getting stuff to work. All the major manufacturers have tools that run natively under Windows and treat that as their primary target platform. I've gotten up and running fastest under Windows compared to Linux (Fedora) and OS X. The only exception is probably Microchip ever since they released MPLAB X; it worked right out of the box on my Mac.

movax
Aug 30, 2008

Otto Skorzeny posted:

PS Cypress' programmer costs a hundred bucks because they threw an fpga in it, hehehehe

Yeah, the Microchip Real ICE has an FPGA in it also. Haven't cracked it open, but I think it's Altera of some flavour. Good tool with good drivers though, and it stands up to abuse.

movax
Aug 30, 2008

Zombietoof posted:

Did anyone who ordered that Stellaris Launchpad board for five bucks apiece get their order?

Mine is sitting authorized but unprocessed with a ship date of 12/17/2012. I'm having a bit of a hard time believing it's taking them two months (from my purchase) to ship a pair of Launchpads.

If the price wasn't five bucks I'd probably cancel and just buy them somewhere else. Avnet wants $5 apiece too (with 399 in stock) but wants $22 to ship to Canada, so I'm not really ready to pay more for shipping than for two dev boards.

edit: If it seems like I'm jumping around back and forth with regards to my dev boards, I absolutely am. The Launchpad has prelim OpenOCD support and an available flasher that should work under OSX and Linux, and quite frankly I'm tired of fighting with the Freescale board. I was about to set up an XP environment to develop for it, but then I saw that the Launchpad is a happy camper under Linux/OSX, relatively speaking. For the record I have absolutely no end goal or project that I'm working on. I'm just trying to dick around with the platform. There are a few pet projects I want to try, but there's nothing that is keeping me from using an Arduino or PIC or something, other than curiosity about the ARM platform.

Yeah, I got in on the pre-order and got them at the end of September, I think, though I did use a corporate e-mail which might have helped.

What OS do you really want to develop under? Speaking as someone who had to deliver customer projects and not just blinky LEDs, I was able to move seamlessly between Windows and OS X with MPLAB X for a large PIC project, and my hardware (REAL ICE, USB) worked fine on each.

Added bonus, the IDE isn't even really necessary because they switched to using make for everything. So I could use Notepad++ on Windows and Sublime on OS X.

movax
Aug 30, 2008

Zombietoof posted:

If I had my choice I would be strict OSX. I've managed to cobble together a toolchain thanks to the advice in this thread, and I've gotten Eclipse CDT to work with that exceptionally well. Right now the weakest link is my Freescale board which, admittedly, was an impulse buy without much research. I've got two Stellaris launchpads in queue so I can play with one here and one at work.

I've definitely not ruled out PIC, and their developer support seems to be exceptional. The only real issue I've had with PIC is that I can't find any high profile eval boards. Granted that's partly because I haven't been looking very hard.

A major PIC dev platform is the Explorer 16, which can get pricey. Basically, you can buy the most powerful member of certain families of PICs on little plug-in modules, and it has expansion slots for PICtails of various types (CAN, graphics, etc). The idea is to prototype on these, and then you cut down to the actual PIC you need (less flash/RAM/peripherals/whatever).

The PICKit 3 is a really inexpensive programmer/debugger that's a good start to use. The PICDem boards are decent, though some are a little dated. There are also some PIC32 dev boards with USB connectivity for debugging built in, but PIC32s are kind of in a weird spot, being a 32-bit MCU that's MIPS M4K-based whilst everyone and their mom is doing ARM these days.

I guess what I'm saying is there are some PIC eval boards out there, but certainly not anything as "nice" as Launchpads or various *duinos in terms of cost and relative complexity. It is actually not that hard to build your own PIC dev "board"; with the PICKit3 (which interfaces via 0.100" headers), a DIP PIC, and some basic parts, you could do it all on a breadboard, no joke.

quote:

I've just been skimming the surface of this and was hoping to jury rig what I had into working. As a complete aside (not that I'm accusing anyone of doing this) I definitely don't fault anyone here if they feel like rolling their eyes at me because I'm not really doing any serious research or putting a lot of muscle behind this. It's literally been at the "hey I have an arduino that I like to make LEDs blink on. What else is out there that I can play with? Oh this ARM thing looks pretty cool, let's throw ten bucks at it and see what happens" stage of a hobby. I'm really glad you guys haven't kicked me out on my rear end though :)

Everyone's got to start somewhere.

Jonny 290 posted:

I've been working on a project with Arduino and RPi, and I know I'm not working with 'real embedded' stuff yet, but I am in love with tiny computing devices and the things they can do. I have a Stellaris on order and want to play with the 430 stuff as well.

I'm assuming that it would behoove me to grind on C for a while? I write perl for a living and can hack through Arduino sketches no problem, but as for the finer (uglier?) points of 'real' C, I'm green.

Yeah, you want to get to know C real well. Bitwise stuff is very common, and various toolchains abuse the spec in different ways when it comes to instantiating and utilizing interrupt routines, DMA, and various other macros. Most MCUs lack an MMU, but you can still utilize pointers to great effect, especially considering your limited stack size.

You can get weird-rear end declarations like (compiler specific to XC16 here)
code:
// Configuration Bits
_FBS(BWRP_WRPROTECT_OFF & BSS_NO_FLASH & RBS_NO_RAM);
_FSS(SWRP_WRPROTECT_OFF & SSS_NO_FLASH & RSS_NO_RAM);
_FGS(GWRP_OFF & GSS_OFF);
_FOSCSEL(FNOSC_FRC & IESO_ON);
_FOSC(POSCMD_XT & OSCIOFNC_OFF & FCKSM_CSECMD);
_FWDT(WDTPOST_PS1 & WDTPRE_PR32 & WINDIS_OFF & FWDTEN_OFF);
_FPOR(FPWRT_PWR1 & LPOL_ON & HPOL_ON & PWMPIN_ON);
_FICD(ICS_PGD1 & JTAGEN_OFF);

ECAN1MSGBUF ecan1msgBuf __attribute__((space(dma), aligned(ECAN1_MSG_BUF_LENGTH*16)));
And depending on how nicely the header files are structured, you get a lot of code that looks like this to set up various peripherals / other SFRs
code:
// Bit-rate stuff
C1CFG1bits.BRP = 1;         // 500 kbps
C1CFG1bits.SJW = 0x3;       // SJW = 4 TQ
C1CFG2bits.SEG1PH = 0x7;    // Phase Segment 1 set to 8 TQ
C1CFG2bits.SEG2PHTS = 0x1;  // Phase Segment 2 is freely programmable
C1CFG2bits.SEG2PH = 0x5;    // Phase Segment 2 set to 6 TQ
C1CFG2bits.PRSEG = 0x4;     // Propagation Segment = 5 TQ
C1CFG2bits.SAM = 0x1;       // Triple-sample the bus
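// Sanity check on the numbers above, assuming FCAN = 40 MHz (dsPIC33F ECAN: TQ = 2 * (BRP + 1) / FCAN):
//   TQ  = 2 * (1 + 1) / 40 MHz = 100 ns
//   bit = Sync(1) + PRSEG(5) + SEG1PH(8) + SEG2PH(6) = 20 TQ = 2 us  ->  500 kbps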
And FWIW, I found the MSP430 docs to be worse than PIC docs; felt like I had to spend more time digging for information than actually programming.

e: Assembly can come into play, especially if you have a timing-sensitive ISR or something, but I don't think it needs to be a priority to know.

movax
Aug 30, 2008

KetTarma posted:

On this computer, yeah. I have Ubuntu on my desktop but don't really use it that much. Sorry if I lost my nerdcred.

There's no nerdcred to it; I use Windows for everything because the manufacturers support it, I've been using it for probably 15 years now, and oh yeah, I pretty much never have to gently caress with anything to get a toolchain up and running.

Yeah it's cool for stuff to run on other OSes, but to be honest, I'd rather the companies continue to invest their time and money into improving and perfecting their tools for one OS. They cater to corporations, not hobbyists, and for the most part it's Windows workstations with perhaps some Linux servers doing X11 forwarding for some high-end Cadence/Mentor simulation packages.

movax
Aug 30, 2008

Victor posted:

I have two objections, one of which is somewhat alleviated by PowerShell: the Windows shell sucks. Some things fit IDEs well, but some things really just don't. Second: package manager? Hello? Note that I used 100% Windows up until two years ago; I first transitioned to OSX because of work, then treated myself to a Retina MBP.

Agreed on the first one; the shell is definitely subpar compared to Linux/OS X. There's a project on Kickstarter that looks promising at least, and it promises to work with Cygwin, so git might become less painful for me to use at some point.

I'm not sure what you mean about package management though?

Dolex posted:

Windows (for *all* of its faults) is really great about having turnkey solutions to new technology. Getting certain MSKinect toolkits to compile/install in linux is a complete nightmare. In windows there are three .MSI files to click and we're running. I have a dedicated windows laptop as my Kinect manager for my projector installations.

Also, when OpenCL was "first released" it was nearly impossible to get an NVidia card properly configured in Linux to work with it, so I had a windows box to learn OpenCL development on.

Well to be fair, of course Windows would be the primary focus for Kinect development seeing as it's coming from Microsoft, and I'm sure the existing Xbox 360 SDK had something to do with it as well.

The Nvidia thing is a bit weird to me though (I do a lot of CUDA stuff); on one hand, Windows obviously enjoys a huge amount of software development resources thanks to :pcgaming: and workstation apps, but on the other hand, the supercomputers stuffed full of Teslas for GPGPU work are all running flavors of Linux, so you'd think they'd get some early loving as well.

To remain tangentially on topic though, it seems like AVR enjoys a large amount of cross-platform support, and a vast amount of that is open-source or at least free. Microchip's now cross-platform, but it's all closed-source and proprietary from Microchip (which I'm cool with, but others may not be). Seems like ARM varies based on the specific implementation though.

movax
Aug 30, 2008

Martytoof posted:

Does anyone have a good primer for multiplexing SPI slave select pins? By which I mean using one pin for multiple slave select signals, as I understand is possible.

I'm not really sure what this entails and I'm getting some muddled results googling, though I could just be plugging in the wrong terminology.

Maybe you've swapped the pin name; you can (within reason) share/multiplex the data and clock lines, and have one SS line per slave device. If you really wanted to use a single pin to control multiple SS lines, you'd need, say, an inverter or some other trickery to share the line properly.
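For reference, the normal arrangement looks something like this (just a sketch; SS_PORT and spi_transfer() are made-up names, not any particular vendor's API). SCK/MOSI/MISO are common to everything, and you pull exactly one SS low for the device you want to talk to:
code:
/* Sketch only: hypothetical register address and helper names.
   Clock/data lines are shared; each device just gets its own active-low SS bit. */
#define SS_PORT    (*(volatile unsigned char *)0x0038u)   /* hypothetical GPIO output register */
#define SS_ADC     (1u << 0)
#define SS_EEPROM  (1u << 1)

extern unsigned char spi_transfer(unsigned char out);     /* your byte-exchange routine */

unsigned char spi_read_device(unsigned char ss_mask, unsigned char cmd)
{
    unsigned char result;

    SS_PORT &= ~ss_mask;            /* assert only this device's SS */
    spi_transfer(cmd);              /* clock out the command */
    result = spi_transfer(0xFFu);   /* clock in the reply while sending a dummy byte */
    SS_PORT |= ss_mask;             /* release the bus for the next device */

    return result;
}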

movax
Aug 30, 2008

Silver Alicorn posted:

Has anyone had any success with the USARTs on the STM32F05xxx? I'm trying to use my LaunchPad as a passthrough to USB, but I'm not sure if it's working or not. The TX pin will light up an LED when it's active, but connecting through puTTY doesn't show my test sequence. Don't know if it's the STM32 or the LaunchPad. Maybe I'll just get one of those USB<->UART cables that are $3 on eBay.

Do you have Tx->Rx set up properly? They should be crossed over; double-check the pin names vs. what they actually are. Some people label their pins Tx when it's actually a receiver buffer that you hook the external Rx line up to.

After we made a mistake at work swapping some PCIe lanes around in the same way, we instituted a policy where all parts with transmitters/receivers have the buffers drawn on the symbol to prevent any confusion with regard to pin names.

movax
Aug 30, 2008

peepsalot posted:

Did anyone here get in on the FreeSoC Kickstarter? It was funded a few months ago but I missed it. Actually I think I saw a link to it during the campaign, but didn't actually understand what it was at that point and just groaned about there being another arduino clone.

This thing looks pretty drat cool though. Basically this Cypress PSoC chip that it's based around has a bunch of super configurable peripherals where you can chain together all these things like analog mux, opamp buffers, DACs, etc, just from their snazzy free(of cost) dev tools. So you can customize the chip for whatever purpose you want.

I think I'm gonna have to pre-order one from their store even though the kickstarter rewards aren't scheduled to ship till Jan.

Totally missed it, but it does look pretty slick, and I think it'll be fine for hobbyist/etc. use. The configurable blocks are the big draw; need a ton of PWM for motor control? No problem! gently caress ton of SPI/I2C for weird reasons? Still good!

Paging Otto to this thread to rant about how the PSoC has screwed him over repeatedly (though I think he was having problems with a specific version).

movax
Aug 30, 2008

The internal (F)RC is good for most things; most micros will demand an XTAL for the precision required of things like USB though. Even UARTs might drift with the RC oscillator, but sometimes both ends of the link can tolerate the baud-rate drifting a little bit.

I feel like I wrote about this in this very thread or in the DIY Electronics thread. So late. :(

movax
Aug 30, 2008

With 4 GB of address space, you can do things like bit-banding and give individual bits of SFRs their own address in memory, making read/write operations really fast and efficient. The higher-end ARM MCUs will usually have bit-banding.
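For the curious, the bit-band alias address is just arithmetic; here's a minimal sketch using the standard Cortex-M3/M4 peripheral bit-band region (the SFR address below is made up purely for illustration):
code:
#include <stdint.h>

#define PERIPH_BASE     0x40000000UL
#define PERIPH_BB_BASE  0x42000000UL   /* start of the peripheral bit-band alias region */

/* alias = alias_base + (byte offset into the region * 32) + (bit number * 4) */
#define BITBAND_PERIPH(addr, bit) \
    ((volatile uint32_t *)(PERIPH_BB_BASE + \
        (((uint32_t)(addr) - PERIPH_BASE) * 32U) + ((bit) * 4U)))

#define FAKE_GPIO_DATA  0x40004000UL   /* hypothetical SFR address for illustration only */

static inline void led_bit_set(int on)
{
    /* single store to one bit, no read-modify-write of the whole register */
    *BITBAND_PERIPH(FAKE_GPIO_DATA, 3) = (uint32_t)(on != 0);
}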

movax
Aug 30, 2008

KetTarma posted:

I'm trying to program an ATmega328 with an AVR ISP2. I am absolutely certain that I have all of the pins correctly connected. I have 9VDC coming from a homemade rectifier that's tied into the mains that goes through a 5VDC linear regulator. This 5VDC is hooked into the VCC pin on the ATmega. The ground connection is hooked up to one of the ground connections. I can confirm that the ATmega is getting proper voltage with an LED in parallel with the VCC/GND pins to indicate power.

What is causing AVR Studio 6 to say that I am reading 0.0VDC? Everything is connected via breadboard. I swapped in another processor and it gave me the same problem. I've programmed processors with this MCU in the past so I know it works. Further confusing me is that the status lights on the MCU are both solid green indicating that it is functioning correctly. All firmware is up to date.


help me make robot go :(

If it is anything like PIC/other programmers, part of the ICSP/ISP header pinout should be tied to VCC as well for target circuit voltage sensing, right? Is that hooked up properly?

movax
Aug 30, 2008

Delta-Wye posted:

Here is a weird question that I thought someone here might have some insight into off the top of their head. When I'm doing embedded programming (say, MSP430) I usually use the smallest datatype available. If I know a value will fit into a char, I'll use a char, even if I am not particularly memory constrained. That is to say I don't just hold ASCII characters in a char datatype, but use it as an 8-bit integer. Supposedly this doesn't work very well on a platform like the MSP430. I was told that the 16-bit processors will read a 16-bit value out of memory, mask it to an 8-bit value, do the required operation, then have to mask it again with the contents of memory when storing it. Needless to say, such masking operations would add computational overhead, so if you are not memory constrained it is more computationally efficient to use a 16-bit (int) value.

I haven't taken the time to dig up documentation that specifies one way or the other, or compile c code to assembly and see what it is doing, but am interested if anyone has insight. If it adds a ton of overhead, this would be a counterproductive habit I should break because I am rarely that memory constrained. For cases like a massive array of 8-bit samples or something, the computational overhead may be acceptable, but otherwise I've been doing a very silly thing.

I like re-using a global types.h per platform I develop on (sometimes the compiler has a good one) where I just typedef byte/word/dword/etc to the appropriate compiler type for that platform (union'ing them comes in handy too sometimes). The 430 does have byte opcodes though as peeps said.
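Something along these lines (a sketch; the underlying widths are compiler-specific, and these assume a 16-bit PIC/XC16-style target where char/int/long are 8/16/32 bits):
code:
/* types.h -- per-platform fixed-width aliases; adjust the typedefs per compiler */
#ifndef TYPES_H
#define TYPES_H

typedef unsigned char  byte;    /*  8 bits */
typedef unsigned int   word;    /* 16 bits */
typedef unsigned long  dword;   /* 32 bits */

/* The union trick: pick apart a 32-bit value without shifting/masking by hand */
typedef union {
    dword d;
    word  w[2];
    byte  b[4];
} dword_parts;

#endif /* TYPES_H */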

Victor posted:

Heh, at my job we're switching away from an Arduino Mega to our own board with an STM32F4xx chip for our quadcopter flight board. I think a few questions could easily winnow the ranks of applicants. For example, ask them if they like the IDE (correct answer is it's nice for simple stuff and ridiculous for anything non-simple), if they like the TWI library (correct answer is it's nice for simple stuff but horrible for robustness), etc.

How are you liking Google protocol buffers on embedded devices, so far? I have to do the same thing at work for our quadcopter. We're using AeroQuad code now and it's... not as robust as what we'd like. (I probably ought to be politically correct.)

As anyone who has read my posts knows, I've kind of got a little hatred for the Arduino, but even putting that aside, I can quickly knock out applicants by asking them to do bitwise operations in C / write a sample ISR during interviews. I don't mind reminding them of the syntax or anything like that, but some people can't even grasp the concept of bitwise operations or architecting an ISR without function calls. When I inquired a bit further, well, there are apparently communities out there convincing people that Arduino skills make you an embedded wizard and that the rest of us wasted our time/money in school/elsewhere acquiring our skills.
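The kind of thing I mean is nothing fancy, something like this (a sketch in XC16/dsPIC flavour; the exact register and vector names vary by part, so treat them as assumptions and check your device header):
code:
#include <xc.h>

#define MOTOR_EN   (1u << 3)           /* bit 3 of some control word */

volatile unsigned int ctrl;
volatile unsigned int last_sample;

void motor_on(void)    { ctrl |=  MOTOR_EN; }              /* set a bit   */
void motor_off(void)   { ctrl &= ~MOTOR_EN; }              /* clear a bit */
int  motor_is_on(void) { return (ctrl & MOTOR_EN) != 0; }  /* test a bit  */

/* Minimal ISR: no function calls, no loops -- grab the data, clear the flag, leave */
void __attribute__((interrupt, no_auto_psv)) _ADC1Interrupt(void)
{
    last_sample = ADC1BUF0;
    IFS0bits.AD1IF = 0;                /* clear the ADC1 interrupt flag */
}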

movax
Aug 30, 2008

Otto Skorzeny posted:

The arduino libraries aren't the only bad I2C routines out there.

Everyone reinvents I2C libraries all the time, and some of the app notes companies put out have sample routines that aren't entirely production-level robust. IMO more companies should be putting out "official" software libs / "extensive samples" for use by their customers. It can't take much more than 1 or 2 Sr. SW engineers + a week or two to release (even warranty-less) well-written, solid, interrupt-driven serial comm code for SPI/I2C and friends.

movax
Aug 30, 2008

Otto Skorzeny posted:

Haven't seen your code. Hope it's good. I2C is simple enough but it gets screwed up often. My recommendation is that you provide versions of your functions with specified timeouts.


In my post, however, I was referring to Cypress' I2C routines for the PSoC 3 and 5, which have a similar flaw that looks like

code:
...

send_a_thing();
while (!BIT_IN_A_SFR_WHICH_INDICATES_AN_ACK_WAS_RECEIVED);

...
Thus, if the slave doesn't get the first byte, it will hang forever waiting for a response that isn't coming. As I mentioned in an earlier post that consolidated my many and varied complaints about Cypress Semiconductor and the products thereof, on the bright side it serves as a good excuse to gain experience with the implementation of a watchdog timer :v:

Yep, I did exactly that whilst writing some quick and dirty I2C code (I believe that is also in Microchip stuff). Or maybe it was waiting on a CAN Tx message, can't remember. Setting a timer or utilizing an RTOS where you have global WDTs will take care of issues like that, though the latter is a bit heavy for most projects.
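The fix is trivial too; a sketch of the timeout-guarded version, reusing the placeholder names from Otto's snippet above (the loop count is arbitrary, and a real version would key off a timer tick instead):
code:
#define ACK_TIMEOUT_LOOPS  10000ul

int send_a_thing_with_timeout(void)
{
    unsigned long tries = ACK_TIMEOUT_LOOPS;

    send_a_thing();
    while (!BIT_IN_A_SFR_WHICH_INDICATES_AN_ACK_WAS_RECEIVED) {
        if (--tries == 0)
            return -1;      /* no ACK: bail out, reset the bus, kick the WDT, whatever */
    }
    return 0;               /* ACK received */
}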

movax
Aug 30, 2008

Arcsech posted:

Anybody have an "Complete Idiot's Guide to Interrupts on the Stellaris Launchpad"? Because I've been beating my head against the wall with this drat thing.

All I want it to do is turn on the loving LED when I press the button, and I want it to do it with an interrupt. But it never hits the breakpoint I set at the start of the ISR. The example code they have with timer PWMs isn't really all that helpful.

Most interrupt controllers function similarly:

- An interrupt enable register
- An interrupt flag/status register
- Possibly an interrupt vector/priority selection
- A global interrupt enable that you flip on after everything is set up

If it's never breaking, it sounds like (assuming the debugger is configured properly) you're either not enabling the interrupt properly or the interrupt source is never firing. Is it a change notification interrupt or something?
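For reference, the whole checklist on the Launchpad comes out to something like this (a sketch from memory using the StellarisWare driverlib calls, so double-check the exact function/constant names against the driverlib docs; if memory serves, SW1 is on PF4 and the red LED on PF1):
code:
#include "inc/hw_memmap.h"
#include "inc/hw_types.h"
#include "driverlib/sysctl.h"
#include "driverlib/gpio.h"
#include "driverlib/interrupt.h"

void ButtonISR(void)
{
    GPIOPinIntClear(GPIO_PORTF_BASE, GPIO_PIN_4);             /* clear the flag first */
    GPIOPinWrite(GPIO_PORTF_BASE, GPIO_PIN_1, GPIO_PIN_1);    /* red LED on */
}

int main(void)
{
    SysCtlPeripheralEnable(SYSCTL_PERIPH_GPIOF);              /* clock the GPIO port */

    GPIOPinTypeGPIOOutput(GPIO_PORTF_BASE, GPIO_PIN_1);       /* LED */
    GPIOPinTypeGPIOInput(GPIO_PORTF_BASE, GPIO_PIN_4);        /* SW1 */
    GPIOPadConfigSet(GPIO_PORTF_BASE, GPIO_PIN_4,
                     GPIO_STRENGTH_2MA, GPIO_PIN_TYPE_STD_WPU); /* button needs the pull-up */

    GPIOPortIntRegister(GPIO_PORTF_BASE, ButtonISR);          /* hook the handler */
    GPIOIntTypeSet(GPIO_PORTF_BASE, GPIO_PIN_4, GPIO_FALLING_EDGE);
    GPIOPinIntEnable(GPIO_PORTF_BASE, GPIO_PIN_4);            /* enable the source */
    IntMasterEnable();                                        /* global enable */

    while (1) { }
}
GPIOPortIntRegister should also take care of getting your handler into the vector table, so you don't have to edit the startup file by hand.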

movax
Aug 30, 2008

I've got an AVR project coming up (making a Larsen scanner clone for a con) with the ATTiny2313A; anyone ever use TSB as a bootloader? What's the best AVR programmer / debugger to use? Similar to Microchip Real ICE and/or PICKit 3. The Dragon?

movax
Aug 30, 2008

Disnesquick posted:

I'm currently looking for an interface to run large numbers of sub-controllers (they will be controlling heterogeneous devices) from a single point of connection to a main computer. The bandwidth requirements are likely to be extremely low: In the order of 100 bytes/s at peak. The architecture that I've come up with involves a USB interface to a multiplexer, which can connect to other multiplexers and then to the final devices in a tree-like manner (number of endpoints needs to scale to 100s). Ideally, the endpoints would be left to do their thing and then report back to their master with an interrupt. I'll be using xmegas for this so I have about 34 pins for IO and would quite like to be able to run about 8 devices per multiplexer. The architecture I have in mind would therefore benefit from having device-select and interrupt share a line and then have a data bus shared between all sub-devices, with a similarly common clock. Questions:

1. Is this a stupid way of designing a scalable system? It really does need the ability to just bolt on more and more devices over time: hence the tree architecture.
2. Is there any kind of interface standard that would fit the bill? Given the limited number of in-built com subs on each chip, I'm quite open to bit-banging, which should be ok given the low bandwidth requirements.

So if I understand correctly you have:

Computer -> USB -> Bridge Device -> "Master Nodes" --> "Sub Nodes"?

First thing that jumps to mind would be RS-485 or a CAN/LIN combination. But you say you want each of the sub-devices to share a data bus; how far are they from their master? With the data rate as slow as you say it is, it's possible you could share SPI data lines among the sub modules and use chip-select to control whichever one you're talking to.

I don't think it's a terrible way of designing a scalable system; you have some master nodes, each of which can support x number of baby nodes. As long as you're OK with the cost of a master node, it shouldn't be a problem. How far apart are all of these nodes?

movax
Aug 30, 2008

movax posted:

I've got an AVR project coming up (making a Larsen scanner clone for a con) with the ATTiny2313A; anyone ever use TSB as a bootloader? What's the best AVR programmer / debugger to use? Similar to Microchip Real ICE and/or PICKit 3. The Dragon?

OK, so I discovered the existence of the micronucleus bootloader, used on the ATTiny85 that's on the Digi-Spark. I'm thinking now I might do the USB connector on PCB thing, switch to a 20-pin ATTiny85 and figure out a way to cheaply have 5V USB input not destroy a coin cell battery when that is also plugged in. (Probably a diode-OR, and see how bad the parasitic power dissipation is)

Anyone know if the micronucleus-85 bootloader is easily portable to the 2313A?


movax
Aug 30, 2008

Disnesquick posted:

Not sure if you're on the same wavelength so I'll clarify:

Computer -> USB -> Root master -> sub masters 1..8 -> "Sub nodes" 1..64 (assuming 8 of the multiplexers/sub masters)

Each master node / multiplexer would cost peanuts, since I intend to use a cheapo uC for each one and IDC connections.

From what I understand, SPI doesn't allow the slave to initiate a data frame, which would necessitate polling. I guess that would be OK but interrupts would be preferred.
CAN/LIN does look the business, however; it seems to be pretty much what I'm looking for and there seem to be a few AVRs with hardware support. Cheers.

Ah, OK. Yeah, I think CAN/LIN would work; you could use LIN to talk from sub nodes to sub masters (you'd need 4 LIN buses though), and then CAN within the sub-masters / to root master. Or, you could calculate all the capacitance / cable length values and just make everything a CAN node.

movax
Aug 30, 2008

Yeah I just started using a Dragon as well, and while it's a bit clunky, it's certainly usable and the software isn't awful for it.

Today though I have like 8 hours to get a working USB bootloader running on the ATTiny4313; trying to port the micronucleus-85 one over :smithicide:

movax
Aug 30, 2008

I did give Victor the go-ahead to post that, and discussion is good; just try to keep it civil is all. Looks like Victor is willing to answer questions honestly/openly, which is nice!

movax
Aug 30, 2008

And to rebreathe some life into this thread...

Question re: IMU board, is that a 2-layer PCB still?

And re: AVR Studio, fuckin' ugh. Wasted like four hours of life trying to use it to flash some chips via ISP, and it just kept loving it up. Some cobbled-together version of avrdude + libusb managed to get the job done with no issues whatsoever. And that doesn't include the time it took to find some F/F jumpers and get the AVR Dragon rigged up for HVSP to un-gently caress bogus fuse settings AVR Studio wrote (bonus: unless I'm missing something, the current version of Studio is bugged and can't flash program code using HVSP, just fuses). I'm really beginning to hate everything remotely related to Atmel and the AVR. PICs 4 Lyfe

movax
Aug 30, 2008

Martytoof posted:

Probably a really frowned-upon opinion, but I'm starting to think everything I need to do I can just do on an arduino because I'm way too impatient to do low level stuff these days.

I'm actually kind of looking forward to this thing: http://www.sparkdevices.com/. Cortex M3 Arduino + WiFi onboard. Use Wiring or use the JTAG headers to write low level stuff to the chip.

Once you do enough "low-level" stuff, you generally end up with a library of code you can take from project to project anyway, which is basically what Arduino sits on (though admittedly their libraries are probably better tested simply by virtue of having a shitload of users).

That said, as long as you don't turn into the typical Arduino user and understand the limitations/relative (dis)advantages of each approach, it's all good in my book.

e: not you specifically, duh

movax
Aug 30, 2008

Victor posted:

My last experience with AVR Studio was in 2003; it worked then, but a lot can change in ten years. Have you had issues using GCC tools and avrdude, or do you just prefer an IDE?

I'm partial to Visual Studio, which it apparently wraps around now, but no, I was OK doing the development with Notepad++ + gcc, that was fine (mostly because I didn't need any crazy debugging for this). I was only using Studio to drive the AVR Dragon since I figured, hey, Atmel dev tool + Atmel software == everything works!

movax
Aug 30, 2008

Having quality libraries (ISRs and all) for common peripherals like I2C, SPI, UART, PWM, etc. would be amazing, as well as still being able to utilize those with your own C code (vs. being forced to use Processing, etc.). That said, it's tough to monetize that (factoring in enforcing licenses and stuff); I think you've got a decent chance of engaging the quadcopter/drone community, especially if you could come up with some convincing/easy-to-understand PR to explain why your board may be more suitable than an Ardupilot.

movax
Aug 30, 2008

Kire posted:

Something I'm confused about with the ISRs on the MSP430 line of chips is if the chip only has one processor, the ISR has to be swapped in and the main program be swapped out for a while for the ISR to run, so it's not truly happening in the background like on a multi-threaded system, correct? So for this example of kicking the analog reads off to an ISR, that requires a multi-core or multi-threaded system to have any effect, right?

Are those terms multi-core and multi-threaded appropriate for talking about microcontrollers?

Some architectures maintain a shadow set of registers so that you don't have to waste time pushing/popping them on the stack; you literally have a different set to play with when you're in ISR land. Other things to never do in ISR land include making function calls (yeah...), large loops, etc. On chips with a DMA engine you don't even have to worry about the data transfer part; all your ISR will do is clear some interrupt flags, at the expense of a cycle or two.
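To make it concrete on the MSP430, an ISR is just a short routine the hardware vectors to while main() is paused; a minimal sketch (msp430-gcc syntax, assuming an ADC10-equipped part like the G2553, so treat the exact vector/register names as assumptions):
code:
#include <msp430.h>

volatile unsigned int adc_result;

/* Keep the ISR short: grab the sample, wake main(), get out.
   (Reading ADC10MEM clears this peripheral's interrupt flag.) */
void __attribute__((interrupt(ADC10_VECTOR))) ADC10_ISR(void)
{
    adc_result = ADC10MEM;
    __bic_SR_register_on_exit(LPM0_bits);   /* let main() resume from low-power mode */
}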

movax
Aug 30, 2008

armorer posted:

No reason, and thus far I haven't been using C++ since I don't really know it. It just seemed like C++ was more common from some of the stuff I've read. If it's a better idea to stick with C and just use structs, that's great - less to learn!

Yes, you definitely want to start out with C!

movax
Aug 30, 2008

Sinestro posted:

How do I learn to lay out real PCBs?

Find slides and read a lot of app notes. Then, you'll have to parse all those documents yourself because they will conflict with each other when it comes to recommendations. Luckily the basics are not too difficult, and for most hobbyist-class projects, even terrible PCB layout (usually) ends up working.

movax
Aug 30, 2008

SnoPuppy posted:

Unrelated, but I have a question about ARM development environments.

I'm going to be putting an ARM in a design for the first time, and I really don't want to spend time loving with tools. I haven't quite decided which part vendor I will use, although I'd like to go with a smaller M4/M4F.
I'm actually leaning towards an Atmel SAM4 not because of hardware reasons, but because it looks like they provide a free toolchain that's straightforward to install and set up.

Is there an ARM development/debugging environment that is free/low cost and reasonably easy to install and use?

I will say that I've never once regretted my decision to do all my ARM development in (separate) VMs.

movax
Aug 30, 2008

The absolute cheapest way to get started, in my opinion, is the TI MSP430 LaunchPad. For the price of a Starbucks thingy, you get the programmer/debugger, the chip, loving everything.

C is not that hard! You can definitely make things blink with just a few lines, and it'll get you in the nice habit of reading datasheets to find the information you need. Once you know that, every subsequent processor is easy. It's reflexive: "hey, I need to wiggle some pins...I should look for a chapter called 'Ports' or 'Digital I/O', and find their equivalent of data direction reg, port data reg, and latch reg". Likewise for any hardware peripherals.
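The whole "make it blink" exercise really is just a handful of lines; a sketch for the G2 LaunchPad (assuming the stock part with the LED on P1.0):
code:
#include <msp430.h>

int main(void)
{
    volatile unsigned int i;

    WDTCTL = WDTPW | WDTHOLD;           /* stop the watchdog so it doesn't reset us */
    P1DIR |= BIT0;                      /* P1.0 (red LED) as an output */

    for (;;) {
        P1OUT ^= BIT0;                  /* toggle the LED */
        for (i = 0; i < 50000; i++) { } /* crude software delay */
    }
}
Flash that and you've been through the "find the port registers in the datasheet" loop once; everything after that is just more registers.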

Did I mention that it's less than $5 and a few hours of your time? Well, OK, so add a few bucks for shipping. Still cheap as balls.

movax
Aug 30, 2008

armorer posted:

So I'm currently planning on getting this up and running on a Windows 7 box running AVR Studio 6, my Dragon, and an atmega1284-PU. Did you mention at some point that you are developing in a VM on OSX? If so, if I get it running without the VM in the mix, I'll have a go at it in a VMWare Fusion Windows 7 instance on my mac laptop. I am planning to keep the code super simple, and just set up an ISR to trigger on a clock timer or button press or something. I will probably just breadboard a minimal circuit to test it out. If I get through all of that, I also want to see what I can do on an atmega328p and an attiny85 with debugWire.

I don't know if you have any specific requests, but that's the stuff I have lying around.

I never had any issues running MPLAB (Microchip) in VMWare Fusion on my MBP; the debugger gets passed through directly, so there weren't ever any issues with performance or stability. They later came out with MPLAB X, which has an OSX native version, but shortly thereafter I just bought a ThinkPad because I can't deal with the MBP keyboard & layout slowing me down (I juggle 3-4 different layouts/OSes a day)

e: The Dragon+AVR Studio 6 sure as poo poo will waste 2 hours of your time loving up simple ISP programming when AVRDude+Dragon can get the same thing done instantly, albeit with some libusb fuckery

movax
Aug 30, 2008

Saleae Logic is worth it as well, though a bit pricier. The new version should let you set adjustable threshold levels also.

movax
Aug 30, 2008

JawnV6 posted:

Place & Route is a nondeterministic process. Analyzing the entire scope is out of the question, so probabilistic techniques like simulated annealing are used. Slight changes to input logic can cause huge re-jiggering in the backend, so single-line changes often incur the whole compilation process and seemingly trivial changes can cause the process to take huge variations. If you're freaking out over 10 seconds, I am saying that this is a domain where such concerns will cause a great amount of frustration.

This has probably been a rumour for ages, but I met a guy who used to work at Synopsys who was nuts; math major, I think. Long story short, he worked on their synthesis tools and claims that on multiple occasions his team was forced to bury work that sped up P&R, to the point where Synopsys would be hosed because they couldn't make any more speed improvements and thus couldn't sell upgrades.

Reeks a bit of bullshit, but the guy was smart as poo poo, and I'd tend to believe it, though I bet his tale was embellished somewhat.

Our current FPGA stuff takes ~8 minutes to compile on a single dual-core Westmere Xeon which isn't terrible. One of my larger designs was a good 15 minutes though :(

movax
Aug 30, 2008

Slanderer posted:

Is TI still horrible at the documentation for those things? I've been wanting to get a Beaglebone Black recently, since my original Beaglebone disappeared a long while back. I remember the documentation for the hardware being incomplete at the time, and not all of the basic drivers were done, which was cool.

I would say the actual documentation is decent; it's the organization I always found lacking. Maybe I'm weird, but my favourite documentation is Microchip's, maybe because I used them first for everything.

Also Jawn/Victor no need to derail into a flame/snark war please.

movax
Aug 30, 2008

Krenzo posted:

I should have said I'm not new to embedded programming. I've been using Atmel microcontrollers up until now with Atmel Studio and programming them with a JTAG cable. I've played around with TI's Code Composer Studio with their Stellaris Cortex-M4 eval board but have not read anything about their ability to program external chips. I just now found that NXP's LPCXpresso can be used to program their own chips which is pretty much what I was looking for. I read that ST has their own programmer device, but the software support for it seems to be deficient. I will probably just go the LPCXpresso route.

If it's your first foray into microcontrollers, I'd almost suggest against an ARM, to be honest.

It's kind of become a recent view of mine, after some talks/discussions with some guys from Freescale, NXP, etc. It stems from the fact that while eval boards might be usable, there are a lot of gotchas / things to be careful of when transplanting a design to a PCB of your own (which is something attainable for hobbyists via EAGLE + OSH Park), because the clocks / edge rates can be quick enough that haphazard (hobbyist) PCB design doesn't cut it. Poor power delivery and bad grounding / coupling can manifest as random instabilities / performance issues that would frustrate the hell out of a guy who just wanted his own board.

While some of the families have decent toolchains and cheap eval boards, the chips themselves are fairly complex and not everyone has good sets of libraries that effectively abstract / wrap that functionality. Multiple clock domains (not unique to ARM I guess, even baby PICs/AVRs have them), many different power states, existence of a boot ROM, complex interrupt controllers etc require a bunch of reading/research in the datasheet.

If you can get away with using a simpler 8-bit/16-bit micro (tinyAVR, PIC, MSP430, etc.), I think you will learn more, and the amount of material to understand in terms of architecture and peripherals is more manageable. They are also fairly forgiving when it comes to PCB layout, so you should have an easier time translating said design to a PCB of your own.

movax
Aug 30, 2008

Rescue Toaster posted:

Anybody do a lot of work on both AVR and PIC? I've used AVR's for ages but I'm looking at getting into some PIC stuff too. Mainly because of the dsPIC33 series which has some pretty amazing features and performance at pretty great discounts compared to what AVR offers. In my particular case the models that have a CAN bus controller integrated are *dramatically* better than the AVR CAN models that cost significantly more.

I really like the way the PIC compiler supports bitfields for structs and IO registers, it's so much nicer than the AVR bit manipulation & automatic type-promotion shitshow. But the lack of optimization on the free XC16 compiler is pretty terrible. Having non-free toolchains seems so totally ridiculous these days.

Any other caveats or gotchas when transitioning from AVR to PIC? Or any other advice/resources/tutorials anyone can recommend to someone who knows microcontroller development but just wants to learn 16-bit PIC specifically?

I've done probably a dozen PIC projects/products so far; I like them and their tools better than AVR in every way (though I also started with PICs). I used a dsPIC with its integrated CAN controller + DMA engine pretty easily (1 hour to get up and running) for a project, no major issues.

The Real ICE is the best tool they offer, but it's pretty expensive; you'll probably be fine with a PICKit 3.

movax
Aug 30, 2008

Rescue Toaster posted:

Thanks. Sounds promising. For my purposes & limited run size I can always just go up a flash size if the optimization is a problem. Plus being able to prototype the CAN chips in DIP is really nice.

Also the DMA not being a giant pain to use is awesome.


EDIT: Microchip gets bizarrely selective about what pins have PPS and what are hard wired, and which are available.
For the dsPIC33EP/MC50x series that I was looking at, there's no PPS support for CTS/RTS flow control... unlike every single other 16-bit pic I've looked at so far.
The 28 pin ones have no assigned CTS/RTS pins, so flow control is unavailable.
The 44 pin TQFP has an RTS pin, but no CTS pin (seriously wtf?).
Only the 0.5mm 64 pin versions that cost as much as all the higher end dsPIC's anyway have both pins (still no PPS).

Yeah, it's all kind of crazy hosed up; they must have piles and piles of RTL lying around that they just throw together. At least the dsPICs have a reasonable breakdown in terms of GP/MC/etc. PPS is more common on the PIC24Fs I've found, and only on the lower pin count ones at that.

movax
Aug 30, 2008

isr posted:

microchip.com/microstick , check out the Microstick II. It's what I use for quick prototyping; it comes with a few 28-pin DIP PIC24s, and a PIC32. You can program/debug/etc. with it pretty easily. There are some making GBS threads things about it, like they didn't bring out VDD/VSS to the pins, so you'll have to solder some wires if you want to do certain things.

I agree that XC16 sucks, their compilers all suck. I've heard very few copies actually get sold. The 16 bit PICs are really easy to program in ASM. If you want to learn 16-bit PICs, learn to program them in ASM. Then, when the compiler gives you poo poo, (it will!) you'll be able to find it faster.

I've had pretty good experiences with the free version of XC16 but I fully admit I'm not at very high device utilization / large complexity projects. Will agree that programming them in ASM is pretty easy though.
