feedmegin
Jul 30, 2008

Luigi Thirty posted:

Okay cool. The system has a boot ROM that basically jumps to user code via the interrupt table, and I think I've got it working.

GAS blows though. I like VASM since it has Motorola syntax instead of AT&T syntax. Is there a way I can mix binary files from VASM with GCC's output? VASM can output just about any object format. I want to end up with an ELF from my linker which I turn into an S-record.

Just generate ELF .o files then? Should pretty much Just Work.


hendersa
Sep 17, 2006

Luigi Thirty posted:

Okay cool. The system has a boot ROM that basically jumps to user code via the interrupt table, and I think I've got it working.

GAS blows though. I like VASM since it has Motorola syntax instead of AT&T syntax. Is there a way I can mix binary files from VASM with GCC's output? VASM can output just about any object format. I want to end up with an ELF from my linker which I turn into an S-record.

ELF? I haven't messed around with m68k in a long time, but wouldn't you be using a.out, instead? I suspect that an .o emitted from VASM will link with GCC-compiled .o files without an issue, unless there is a symbol mismatch of some sort when you are calling asm from C and vice-versa. You're handling the stack setup/teardown for parameter passing manually in the asm, so just make sure it matches up with the stack convention of the GCC-generated stuff and you'll be fine.

Luigi Thirty
Apr 30, 2006

Emergency confection port.

Thanks, I got it booting and running a simple C program. I'll see if I can set up Newlib and get something a little more complex going.

Luigi Thirty
Apr 30, 2006

Emergency confection port.

I'm running into a problem with the linker. I need to set up the vector table at 0x10000 and some configuration info at 0x10080. The program code starts at 0x10100.

code:
cseg

VectorTable:

VEC_RESET:		;$10000
    JMP     HANDLER_RESET

etc etc
And then I have this linker script:

code:
MEMORY
{
	rom (rx)		: ORIGIN = 0x010000, LENGTH = 32K
	ram (!rx)		: ORIGIN = 0x400000, LENGTH = 8K
}

SECTIONS
{
	. = 0x10000;
	CODE.VectorTable : { *(CODE) }

	. = 0x10080;
	CODE.Configuration : { *(CODE) }
	
	. = 0x10100;
	CODE : { *(CODE) }
	.text : { *(.text) }
	
	. = 0x400000;
	.data : { *(.data) }
	DATA : { *(DATA) }
	.bss : { *(.bss) }
	BSS : { *(BSS) }
}
But VectorTable ends up starting at 0x10100. There's nothing at 0x10080. Since it comes from a Devpac syntax assembler it's all in a segment named CODE. Any suggestions?

hendersa
Sep 17, 2006

Luigi Thirty posted:

I'm running into a problem with the linker. I need to set up the vector table at 0x10000 and some configuration info at 0x10080. The program code starts at 0x10100.

code:
cseg

VectorTable:

VEC_RESET:		;$10000
    JMP     HANDLER_RESET

etc etc
And then I have this linker script:

code:
MEMORY
{
	rom (rx)		: ORIGIN = 0x010000, LENGTH = 32K
	ram (!rx)		: ORIGIN = 0x400000, LENGTH = 8K
}

SECTIONS
{
	. = 0x10000;
	CODE.VectorTable : { *(CODE) }

	. = 0x10080;
	CODE.Configuration : { *(CODE) }
	
	. = 0x10100;
	CODE : { *(CODE) }
	.text : { *(.text) }
	
	. = 0x400000;
	.data : { *(.data) }
	DATA : { *(DATA) }
	.bss : { *(.bss) }
	BSS : { *(BSS) }
}
But VectorTable ends up starting at 0x10100. There's nothing at 0x10080. Since it comes from a Devpac syntax assembler it's all in a segment named CODE. Any suggestions?
Your code is going to be located in .text. You are telling the linker to begin placing .text code at 0x10100, which is why your vector table is being placed there.

One option is to create a new memory section (like MEMORY.vector) at 0x10000, have MEMORY.rom start at 0x10100, and then tell your assembler to place the code in the MEMORY.vector section. It doesn't sound like your assembler does that, though. With GCC, you do something like this in C code:

code:
__attribute__((section(".vector.ENTRIES"))) const uint16_t vector[16] = {...};
... with linker script stuff like:

code:

.vector :
{
*(.vector.ENTRIES)
} > VECTOR

MEMORY
{
        VECTOR (rx)		: ORIGIN = 0x010000, LENGTH = 256 
	rom (rx)		: ORIGIN = 0x010100, LENGTH = 32512
	ram (!rx)		: ORIGIN = 0x400000, LENGTH = 8K
}
Alternatively, you could just sprinkle some .org directives around the asm to push things to the addresses you want within the MEMORY.rom space, and set .text to start at 0x010000. I prefer to explicitly position things by name if they absolutely have to be in a particular location, but if you can position things at explicit offsets from the start of the section and then explicitly position the section origin, that should work, too.

Luigi Thirty
Apr 30, 2006

Emergency confection port.

I got illegal relocation errors on my jmp _main line from my assembler when I tried adding org directives. I think I'll just switch to GAS so I can stay with the entire GNU toolchain even if it's harder to use.

The reason VASM spits out the weird section names is that it's imitating an Amiga assembler, which is what I'm used to.

whose tuggin
Nov 6, 2009

by Hand Knit
Are the majority of you professional embedded programmers Computer Engineers?

JawnV6
Jul 4, 2004

So hot ...
You're asking that in a very leading way.

Is "Computer Engineering" an ABET accredited major at a degree-granting institution? Why's it capitalized?

csammis
Aug 26, 2003

Mental Institution
My degree was in computer science, which at the time and at that university was basically a math/statistics degree. I suspect I'm an outlier though, at least where I work.

JawnV6 posted:

Is "Computer Engineering" an ABET accredited major at a degree-granting institution?

Is that uncommon? "Computer Engineering" is the name of an ABET accredited undergraduate program where I went to school :confused:

csammis fucked around with this message at 02:42 on May 19, 2017

Star War Sex Parrot
Oct 2, 2003

The Scientist posted:

Are the majority of you professional embedded programmers Computer Engineers?
Hard to generalize that sort of stuff, since electrical and computing degrees have varying curriculums depending on the institution. I'll try though: my suspicion is that more embedded folks find their way into it from the EE side, rather than the CS end of the spectrum.

Here at WD/HGST/SanDisk it seems like most of the firmware engineers come from the EE side with some CpE sprinkled in. The CS folks tend to land elsewhere in the company, but that's just my experience. There's definitely nothing stopping CS people from moving to embedded if that's their inclination.

Star War Sex Parrot fucked around with this message at 02:42 on May 19, 2017

Star War Sex Parrot
Oct 2, 2003

JawnV6 posted:

Is "Computer Engineering" an ABET accredited major at a degree-granting institution? Why's it capitalized?
He's studying CpE at Clemson, so yes.

JawnV6
Jul 4, 2004

So hot ...
Right, it's clear he's asking in the context of degrees, but it could just be "Are Embeddeds do Computer Engineer" wholly separate from that somehow? I had the choice between CS, EE, and "Computer Systems Engineering." I went to a CS prof to talk through the options. I went with EE specifically because neither CS nor CSE was certified, but that honestly hasn't impacted my career one way or the other.

I did take a lot of CS classes, nearly qualified for a minor. Took even more after I graduated to fill in the gaps between HLL's and the hardware I was working on.

Star War Sex Parrot
Oct 2, 2003

JawnV6 posted:

I went with EE specifically because neither CS nor CSE was certified, but that honestly hasn't impacted my career one way or the other.
ABET accreditation isn't everything. Some of the higher-ranked (which also isn't everything) institutions don't bother dealing with ABET because they don't want to modify their curriculum to satisfy the sometimes arbitrary requirements. If the school's got a good reputation, no one's ever going to ask "is that ABET accredited?"

carticket
Jun 28, 2005

white and gold.

My title is Software Engineer at work, but all of us are embedded guys. At my last job, Embedded Software Engineer was my title.

whose tuggin
Nov 6, 2009

by Hand Knit

JawnV6 posted:

You're asking that in a very leading way.

Is "Computer Engineering" an ABET accredited major at a degree-granting institution? Why's it capitalized?

Sorry, in the microcosm of school, we always take for granted that someone studying mechanical engineering *is* a mechanical engineer... It never occurred to me that, in the real world, job titles frequently include the term "engineer" and that this is distinct from degrees in engineering. In some sense you are all, by definition, Computer Engineers, I suppose.

Star War Sex Parrot posted:

He's studying CpE at Clemson, so yes.

This is correct :cheers:



I'm just trying to get a feel for the roles of people with degrees in CpE in the world. I get the sense that their roles range from embedded programming to the actual design of the hardware that the embedded software runs on.

I'm pretty convinced this major is for me. I just got finished with a course called "system programming in UNIX with C", and low-level programming in C and C++ is my Favorite Thing. I think the point of the class being in Linux (UNIX) - and us learning to use gcc and gdb - is that these are the tools we will be using to interface with hardware (this really makes sense to me when I look at the documentation for gcc and see all the compiler options and architectures it supports). Has this been you all's experience?

Does anyone use Clang and/or valgrind much?

Hunter2 Thompson
Feb 3, 2005

Ramrod XTreme
I majored in computer engineering at my school, which used to be ABET-accredited for CE. They decided not to renew a few years after I graduated, which is kind of weird.

Edit: Yes to all three of your questions. However, there are proprietary toolchains out there, like IAR and Keil, that you'll run into.

Hunter2 Thompson fucked around with this message at 05:09 on May 19, 2017

Star War Sex Parrot
Oct 2, 2003

The Scientist posted:

I just got finished with a course called "system programming in UNIX with C"
Out of curiosity, what textbook (if any) did they use?

carticket
Jun 28, 2005

white and gold.

Star War Sex Parrot posted:

Out of curiosity, what textbook (if any) did they use?

I feel like that is a course that would use the K&R Bible as a textbook based on the name alone.

carticket
Jun 28, 2005

white and gold.

The Scientist posted:




I'm just trying to get a feel for the roles of people with degrees in CpE in the world. I get the sense that their roles range from embedded programming to the actual design of the hardware that the embedded software runs on.

My experience is that most people leaving school with CpE will go into the software side. Few go into the hardware side from CpE (though some do). The third prong is FPGA and ASIC design. If I had to guess percentages, I'd say 60% write software as their primary role, 30% write FPGA/ASIC code as their primary role, and 10% do electronics design as their primary role.

whose tuggin
Nov 6, 2009

by Hand Knit

Star War Sex Parrot posted:

Out of curiosity, what textbook (if any) did they use?

System Programming With C and Unix by Adam Hoover

The author is a professor at Clemson, so he agreed to have the .pdf version of the book distributed for free. I thought it was a really interesting book because it introduced concepts other than just C syntax, like BASH scripting, Perl, and Linux fundamentals, in a way that I found really accessible. Though I suspect it'd be beneath most of the people ITT.

A week after the class had ended, I took the book to the beach a couple of times and finished it in my free time.

movax
Aug 30, 2008

Most computer engineers I've met are either embedded software engineers, FPGA engineers, or ASIC / digital logic designers (comp arch, basically). Their electrical engineering abilities generally vary with how much they like that stuff / how well they did at it in school, from being utterly useless outside of the software realm to being able to do their own boards / circuit design for their MCUs / FPGAs.

Personally I majored in both, because I could get the credits to line up easily and I'm comfortable living across the boundary doing low-level software work, IC design, PCBs or system architecture. Try out each area until you find the one you like, or keep moving around because that's how you stay sharp / up-to-date.

feedmegin
Jul 30, 2008

The Scientist posted:

Are the majority of you professional embedded programmers Computer Engineers?

My degree is in History :shobon:

carticket
Jun 28, 2005

white and gold.

feedmegin posted:

My degree is in History :shobon:

So you stick to the oldies like Fortran and COBOL?

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
If I want one-time interrupts on DMA transfers on an STM32F4, should I be disabling the transfer complete interrupt within the ISR, so as not to run into issues when queuing a new one? The recommendation is to explicitly disable the DMA stream before starting a new transfer, AFAIK, which seems to trigger another transfer complete interrupt on principle, whether there was an active transfer or not.

Because I'm having a bunch of issues with SPI DMA. When I start a DMA transfer and just sit there waiting in a while-loop for TCIF, it works just fine. But the mere act of enabling interrupts with a handler that does nothing breaks things, and said while-loop appears to never terminate.

carticket
Jun 28, 2005

white and gold.

If you're using an empty handler, you're probably not clearing the TCIF and just infinitely interrupting.

For interrupt-driven operation, I'd mask the interrupt, clear the interrupt flag, disable the channel, and signal the process, in that order.
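Something like this, as a rough sketch - assuming your SPI TX happens to sit on DMA2 stream 3 and using the CMSIS register names from stm32f4xx.h; swap in whatever stream/IRQ your SPI instance actually maps to:

code:
#include "stm32f4xx.h"

volatile int spi_dma_done = 0;               /* "signal the process" stand-in */

void DMA2_Stream3_IRQHandler(void)
{
    if (DMA2->LISR & DMA_LISR_TCIF3) {
        DMA2_Stream3->CR &= ~DMA_SxCR_TCIE;  /* mask further TC interrupts */
        DMA2->LIFCR = DMA_LIFCR_CTCIF3;      /* clear TCIF (write 1 to clear) */
        DMA2_Stream3->CR &= ~DMA_SxCR_EN;    /* disable the stream */
        spi_dma_done = 1;                    /* tell the foreground code */
    }
}
Your foreground while-loop then spins on spi_dma_done instead of polling TCIF directly.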

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
But if I don't clear the TCIF, shouldn't the loop that sits waiting for it get triggered eventually? Or does the interrupt keep getting called when the flag isn't cleared within it? --edit: By that I mean, it exits the routine, the MCU sees the flag is still set, and it immediately calls the routine again.

carticket
Jun 28, 2005

white and gold.

Combat Pretzel posted:

But if I don't clear the TCIF, shouldn't the loop that sits waiting for it get triggered eventually? Or does the interrupt keep getting called when the flag isn't cleared within it? --edit: By that I mean, it exits the routine, the MCU sees the flag is still set, and it immediately calls the routine again.

Typically you need to take some deliberate action in the interrupt to clear the flag, otherwise it will exit the interrupt straight back into the start of the interrupt (Cortex-M cores tail-chain interrupts, so it will basically loop your interrupt over and over). You'll have to check the reference manual: sometimes you have to write to a register to clear a flag, sometimes reading the status register will clear it. If you look up the flag register, it should tell you what clears each flag.

Volguus
Mar 3, 2009
I am working with a Synopsys ARC board which will have to communicate with an Android application. Since I have a lot of memory on the board, and I have FreeRTOS and a network stack and all, I wrote essentially a TCP server that can communicate with the Android application, following my own protocol. And it ... works, as a basic thing. However, I would like (if possible) to avoid manually serializing my data structures as I'm doing now (sending doubles over the network between different languages/architectures is a peach). Is there any serialization library out there that I could use in my embedded system that would be able to read/write to a Java program? I looked at msgpack, but there are only third-party embedded implementations, which (they say) are not very reliable.

What are people using for this?

muon
Sep 13, 2008

by Reene

Volguus posted:

I am working with a Synopsys ARC board which will have to communicate with an Android application. Since I have a lot of memory on the board, and I have FreeRTOS and a network stack and all, I wrote essentially a TCP server that can communicate with the Android application, following my own protocol. And it ... works, as a basic thing. However, I would like (if possible) to avoid manually serializing my data structures as I'm doing now (sending doubles over the network between different languages/architectures is a peach). Is there any serialization library out there that I could use in my embedded system that would be able to read/write to a Java program? I looked at msgpack, but there are only third-party embedded implementations, which (they say) are not very reliable.

What are people using for this?

Protobufs!

Volguus
Mar 3, 2009

muon posted:

Protobufs!

For embedded?

feedmegin
Jul 30, 2008

Volguus posted:

However, I would like (if possible) to avoid manually serializing my data structures as I'm doing now (sending doubles over the network between different languages/architectures is a peach).

Good news, 64-bit IEEE floating point is standard absolutely everywhere these days unless you're, I dunno, talking to a frigging VAX or something; the only thing you've got to worry about is endianness.
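If you do stick with the hand-rolled protocol, the usual trick is to memcpy the double into a uint64_t and send the bytes in one fixed order; a rough sketch (function names made up for illustration, and it pairs with Java's big-endian DataInputStream.readDouble() on the other end):

code:
#include <stdint.h>
#include <string.h>

/* Pack one IEEE-754 double as 8 big-endian (network order) bytes. */
static void put_double_be(uint8_t out[8], double value)
{
    uint64_t bits;
    memcpy(&bits, &value, sizeof bits);          /* reinterpret without pointer-cast UB */
    for (int i = 0; i < 8; i++)
        out[i] = (uint8_t)(bits >> (56 - 8 * i));
}

/* Unpack on the other side (or leave this to Java, which reads big-endian by default). */
static double get_double_be(const uint8_t in[8])
{
    uint64_t bits = 0;
    for (int i = 0; i < 8; i++)
        bits = (bits << 8) | in[i];
    double value;
    memcpy(&value, &bits, sizeof value);
    return value;
}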

Volguus
Mar 3, 2009

feedmegin posted:

Good news, 64-bit IEEE floating point is standard absolutely everywhere these days unless you're, I dunno, talking to a frigging VAX or something; the only thing you've got to worry about is endianness.

Yes, endianness was the only thing I've worried about so far. Good to know that I shouldn't care about anything else, though :).

JawnV6
Jul 4, 2004

So hot ...

Volguus posted:

For embedded?
Yeah? Sure, spend some time hand-rolling a serialization protocol in two separate languages in 2017 if you want to, but there are plenty of bindings for a variety of formats for different use cases and languages available.

nanopb does protobufs in C, capnproto is by the same guy who did protobufs and uses a compact in-memory format suitable for slamming out over a wire, and CoAP is a good fit if other parts of the system are REST.

But sure, embedded serialization is a special snowflake that nobody's tended to.

Volguus
Mar 3, 2009

JawnV6 posted:

Yeah? Sure, spend some time hand-rolling a serialization protocol in two separate languages in 2017 if you want to, but there are plenty of bindings for a variety of formats for different use cases and languages available.

nanopb does protobufs in C, capnproto is by the same guy who did protobufs and uses a compact in-memory format suitable for slamming out over a wire, and CoAP is a good fit if other parts of the system are REST.

But sure, embedded serialization is a special snowflake that nobody's tended to.

The entire reason I asked the question is that I imagined my problem was already solved. I thought (wrongly) that maybe protobufs were not quite designed for lovely CPUs. Is nanopb the recommended protobuf implementation for embedded? What are you using?

carticket
Jun 28, 2005

white and gold.

At work, I'm trying to design an IPC system for a master processor (still a microcontroller) to control a bunch of modules that may be connected by I2C, UART, or potentially in the master. It's a platform that we're going to use for several products, thus the different connectivity options.

I've never used protobuffers. Is that something I could use for the messaging, and just wrap them up in a transport layer?

JawnV6
Jul 4, 2004

So hot ...

Volguus posted:

The entire reason I asked the question is that I imagined my problem was already solved. I thought (wrongly) that maybe protobufs were not quite designed for lovely CPUs. Is nanopb the recommended protobuf implementation for embedded? What are you using?
I used nanopb at my last company; the server side had a Scala library that inflated it to something native. We weren't really compute performance constrained (100+ MHz Cortex-M4) so I can't speak to that. And if you genuinely have ~12 bytes you're trying to get across a channel where you're writing the other side and already have a PoC up and running, it might not be worth it. But it reduced the data by a surprising amount. The guts are worth a brief scan, e.g. it condenses uint32_t's into 8 bits if the value is small enough. That does make the size dependent on the content, which may or may not pose a problem.

It takes the .proto and spits out generated code. I had to fill in a few callbacks to populate the data. After that it was a binary blob; I'd fill a flash page with as many as would fit and kick them up to the server. No other framing: stack up blobs and shoot them out.
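For anyone curious, the encode side ends up looking roughly like this (SensorReading, its _fields descriptor, and the _init_zero macro are all hypothetical names that nanopb's generator would produce from whatever your .proto defines):

code:
#include <pb_encode.h>
#include "sensor.pb.h"   /* generated by nanopb from a hypothetical sensor.proto */

/* Encode one reading into buf; returns bytes written, or 0 on failure. */
size_t pack_reading(uint8_t *buf, size_t buflen, uint32_t ts, int32_t value)
{
    SensorReading msg = SensorReading_init_zero;
    msg.timestamp = ts;
    msg.value = value;

    pb_ostream_t stream = pb_ostream_from_buffer(buf, buflen);
    if (!pb_encode(&stream, SensorReading_fields, &msg))
        return 0;                 /* PB_GET_ERROR(&stream) says why */
    return stream.bytes_written;  /* the blob you append to the flash page */
}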

Mr. Powers posted:

At work, I'm trying to design an IPC system for a master processor (still a microcontroller) to control a bunch of modules that may be connected by I2C, UART, or potentially in the master. It's a platform that we're going to use for several products, thus the different connectivity options.

I've never used protobuffers. Is that something I could use for the messaging, and just wrap them up in a transport layer?
It's a format designed for quickly passing compact structures around servers. It's designed in a way that you can consume it without having to know all the details of every field, so a server a revision behind can pick out what it understands and pass the rest along.

That said it might be a little odd for something as slow/constrained as I2C? I never had to unpack them on the embedded side, just encode sensor data.

Volguus
Mar 3, 2009

JawnV6 posted:

I used nanopb at my last company; the server side had a Scala library that inflated it to something native. We weren't really compute performance constrained (100+ MHz Cortex-M4) so I can't speak to that. And if you genuinely have ~12 bytes you're trying to get across a channel where you're writing the other side and already have a PoC up and running, it might not be worth it. But it reduced the data by a surprising amount. The guts are worth a brief scan, e.g. it condenses uint32_t's into 8 bits if the value is small enough. That does make the size dependent on the content, which may or may not pose a problem.

It takes the .proto and spits out generated code. I had to fill in a few callbacks to populate the data. After that it was a binary blob; I'd fill a flash page with as many as would fit and kick them up to the server. No other framing: stack up blobs and shoot them out.

Thanks for the nanopb suggestion. Got it up and running on both my dev Linux machine and on the board (I am really lucky: they implement almost everything in the POSIX specification, so only a couple of #ifdefs were needed). While the data I'm sending right now is quite trivial (X*12 bytes, where X is variable), I do expect it to become more complex in the future. And not having to worry about endianness, the protocol itself, and everything related to it is surely a bonus.

Phobeste
Apr 9, 2006

never, like, count out Touchdown Tom, man
The only thing that annoys me sometimes about protobufs is the variable-length integer encoding, which can get pretty annoying if you don't want to be doing dynamic framing, specifically because the variable-length encoding of a number can be larger than the number's native size: varints built from uint32s can be as small as one byte or as large as five.

Other than that they're really good though.
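(For reference, the encoding itself is tiny - 7 payload bits per byte plus a continuation bit, which is exactly where the one-to-five-byte range for a uint32 comes from. A quick sketch:)

code:
#include <stdint.h>
#include <stddef.h>

/* Encode v as a protobuf base-128 varint; returns bytes written (1 to 5 for a uint32). */
static size_t encode_varint32(uint8_t *out, uint32_t v)
{
    size_t n = 0;
    do {
        uint8_t byte = v & 0x7F;    /* low 7 bits of the value */
        v >>= 7;
        if (v)
            byte |= 0x80;           /* continuation bit: more bytes follow */
        out[n++] = byte;
    } while (v);
    return n;                       /* 1 byte below 128, 5 bytes at or above 2^28 */
}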

minidracula
Dec 22, 2007

boo woo boo
Hello thread.

Long time no see.

How you doin'?

If any embedded dev goons are looking for a job in the Seattle area, or are willing to consider relocating, let's chat. Feel free to PM me here with an email address I can reach you at directly.


Popete
Oct 6, 2009

This will make sure you don't suggest to the KDz
That he should grow greens instead of crushing on MCs

Grimey Drawer

Mr. Powers posted:

My title is Software Engineer at work, but all of us are embedded guys. At my last job, Embedded Software Engineer was my title.

Are you me? At my last job I was an Embedded Software Engineer and at my current job I'm a Software Engineer; both are embedded software work.

I have a Computer Engineering degree. It does seem that CmpE and EE make up embedded software people more than strictly CS, probably because CmpE embedded classes usually fall under the EE side of the degree. I originally went into CmpE because I wanted to work with silicon doing VLSI work, but that usually requires an advanced degree. I then wanted to do ASIC/FPGA work, but I took an internship doing embedded software, fell in love with it, and that's what I've been doing since graduating. I initially never thought I would wind up in software because I found high-level software programming (even Java) boring as hell, until I discovered firmware/embedded.
