Mister Speaker
May 8, 2007

This is more of an editing question but I can't think of a better place to ask it than here:

I want to put together a stinger, or bumper, for my YouTube videos (mostly planned DJ mixes, but hopefully some other content like tutorials or live videos in the future). I have a still image I've been using (you can see it here - I think it comes from the dorsal SONAR on a submarine), but I want to animate it. I like 'glitchy' effects, and also stuff that appears audio-reactive, like when the image zooms to appear to 'bump' to a big kickdrum, or shakes radially to a sustained bass swell. So, what software (and plugins?) should I be working in? I have Premiere and After Effects; I only know a little bit about the former and nothing about the latter. I know the 'glitchy' stuff and other effects described above are relatively simple things to pull off, but I haven't any idea where to start. Are there some tutorials that would help me to this end? And also, an important consideration...

The source image files I have for what I'm looking at are... not high-res. If I recall correctly, in order to get that still in 720p I had to blow the image up and apply a couple of blur effects to it, which still looks fine because of the content. I'm planning on shooting and possibly cutting these upcoming videos in 4K - although maybe not, they're just DJ mixes after all, and the few I've already posted in other locations look fine in 1080p. Given that I'm looking to 'glitch things up' I don't think it's a huge deal that the source image isn't the highest quality, but I'm interested in hearing how to make the distortion look like part of the aesthetic, and not just like a lazily-blown-up-and-animated image.
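
To wrap my head around the audio-reactive part, I sketched the basic idea in Python - an amplitude envelope mapped to a zoom value, which I gather is roughly what After Effects' 'Convert Audio to Keyframes' produces. Everything here is a placeholder sketch: 'mix.wav', the 16-bit WAV assumption, and the 110% zoom ceiling are all made up.

code:

# Rough sketch: derive a per-video-frame zoom value from a track's loudness.
# Assumes a 16-bit PCM WAV; "mix.wav" is a placeholder filename.
import wave
import numpy as np

FPS = 25  # video frame rate the keyframes would target

with wave.open("mix.wav", "rb") as w:
    rate = w.getframerate()
    samples = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)
    if w.getnchannels() == 2:
        samples = samples.reshape(-1, 2).mean(axis=1)  # fold stereo to mono

spf = rate // FPS                      # audio samples per video frame
n = len(samples) // spf
rms = np.sqrt(np.mean(                 # RMS loudness per video frame
    samples[:n * spf].astype(np.float64).reshape(n, spf) ** 2, axis=1))
rms /= rms.max() or 1.0                # normalize 0..1

# Map loudness to a zoom keyframe: 100% at silence, 110% on the biggest hit
for frame, level in enumerate(rms):
    print(f"frame {frame}: scale {100 + 10 * level:.1f}%")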

Mister Speaker
May 8, 2007

Daaamn, thanks for the prompt reply! I'll definitely check out that video.

frytechnician posted:

As someone who is literally a full-time editor I do want to just quickly give you a heads up and this is ONE MANS OPINION ONLY so take it for what you will: Glitch, zoom, and uber-flashy transitions are massively overused and sometimes clean, minimal animation is best. This is, of course, your call and I don't want to come across as a dick.

Oh for sure, even just as a casual viewer I can tell
t̻͉̲͇͇h͍̗͘e̩̝̫̰̙s̹e̮͘ ́s̸̻͔͙̦̤o͖̩͈̣͚̳r̟̭͖͔̠ͅt̴̹̟̼ͅ ͚̠͍͇͞o͚̘̜͝f̢̙̩̻͖̱ͅ ̗̰̖̗͎̳̭t̝͓̣h̰̱̼ḭ͍͉͘n̶̫̙͙̘̥̜̺g̢̥̙̭̻̘̜s͉̬̬͓̺ͅ are overused. Given that I'm just looking to create a video bumper I'm not too worried about doing cliché stuff, necessarily. It also happens that one other hobby video project I'm looking to shoot and edit soon is a promo video for an arcade bar, so I'm doubly interested in the (also cliché) world of VHS-lookin', scanline-havin', colour-separatin' noisy synthwave stuff.

Mister Speaker
May 8, 2007

Gear question I can't think of a better place to ask:

I've got a pair of Sony action cams that I'm trying to use as part of a livestreaming setup. They're piped into a Blackmagic ATEM Mini. I just started playing around with angles today and found that both action cams' outputs are a bit... flickery, for lack of a better term. Their output goes to black every few seconds.

I've tried messing with settings on the cameras (thinking maybe it was a framerate discrepancy causing dropouts or something) to no avail. The best explanation I've got is the HDMI connection on the back: the cameras don't have this problem when they're recording internally, it seems more pronounced when the cameras or cables are vibrating, and I can physically hold them at certain angles that minimize the output flickering.

Now I'm hoping the problem is just a couple of cheap loose Amazon micro-HDMI cables, because otherwise (the actual outputs on the cameras being loose) I'm SOL here. They're for recording DJ sets, so naturally there's going to be quite a bit of vibration. Thoughts on this? Thanks!

While I'm here, is there a livestreaming thread more appropriate for further questions on the matter?

Mister Speaker
May 8, 2007

I have an amateur-ish question about timecode and clock sources: What typically generates the clock on a film set? I have a background in audio post-production and we did a small unit on location sound; I seem to recall learning that the sound recordist's gear is usually what everything else is slaved (btw what are we changing this nomenclature to?) to, but I'm not sure.

The reason I ask is that I've finally gotten off my rear end and learned the Timecode Systems wireless ecosystem. I've got a Zoom F8n recorder, a Denecke TS-C slate and an Atomos Ninja V monitor. The slate and recorder have TCS UltraSync Ones attached and the Ninja has the AtomX Sync module which operates in the TCS ecosystem. I got them all locked together last night, but...

Right now the slate is the master; eventually I'd like the Ninja to be the master (DJ livestreaming is all I'm really doing right now, and I'm just using the slate/timecode to begin learning it and because it looks snazzy). This is fine for now - I think the slate is actually the most reliable clock I have - but it got me wondering how this could possibly work on a set, given that the slate is battery-operated and I don't think it has a backup while swapping packs of AAs. Does the slate stay on, running and generating clock, or is it switched off after a take is marked? I understand the phrase 'jam sync' probably factors into the answer somewhere, and I get that it's generally OK to sync a few devices and let them run free since drift can be negligible until you start stacking up the hours, but I don't get how the re-synchronization would work.
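
For a sense of scale on the drift, some back-of-the-envelope math - the ±1 ppm figure is a typical TCXO spec I'm assuming, not something out of the UltraSync manual:

code:

# How long until a free-running generator slips a whole frame,
# assuming a ±1 ppm crystal (typical TCXO spec; check the datasheet).
PPM = 1.0
FPS = 25
frame_ms = 1000 / FPS                        # 40 ms per frame at 25 fps

drift_ms_per_hour = 3600 * 1000 * PPM / 1e6  # 3.6 ms of drift per hour
print(f"{drift_ms_per_hour:.1f} ms/hour")
print(f"~{frame_ms / drift_ms_per_hour:.0f} hours to slip one frame")
# => roughly 11 hours, which is presumably why periodic re-jamming
#    (e.g. at lunch) keeps a long day comfortably within a frame.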

Thanks for your time!

Mister Speaker
May 8, 2007


CaptainViolence posted:

timecode stuff


Awesome, thanks for clearing that up!

I guess my next question, then, is: how come I can't seem to get the slate to receive TC? The UltraSync Ones and Atomos unit lock together no problem, and the sound recorder and monitor both send and receive just fine, but the slate doesn't want to play ball unless it's the master. I THINK I have the issue down to an incorrect cable.

When I got the UltraSync Ones from Trew, I had them make up some adaptor cables, since their I/O is those tiny locking DIN 1.0/2.3 connectors but the other devices are either BNC or 5-pin Lemo. Now, I'm not sure if this is a standardized thing - maybe I asked and forgot - but these cables are colour-coded: some have a red heatshrink collar, others have blue. If this is a standard thing, what is the significance beyond a different pin configuration? The one connecting the slate to its UltraSync One (LTC port on the sync unit) is red, and as I said it only seems to work if the slate is generating TC; I cannot find a setting on the slate that allows it to receive. Am I correct in having troubleshot this issue down to the wrong cable? Do I need to get another one made up?

Mister Speaker
May 8, 2007

Yeah, I've tried all the settings on the smart slate (obvi the 'generate/read' switch, also the small clock-switch that changes generated TC rate) to no avail. Generates no problem, still doesn't want to play ball with any other clock source. But I haven't tried using the 1/4" input - Trew didn't give me a cable with a 1/4" terminal, just the DIN-to-Lemo one. I'll swing by them sometime this week with the slate and ask. Thanks for your help!

Mister Speaker
May 8, 2007

That's what I meant by this:

Mister Speaker posted:

also the small clock-switch that changes generated TC rate

I forgot about the word 'dial', lol. In any case it affects the clock the slate generates, but so far it hasn't helped the slate receive external timecode.

Mister Speaker
May 8, 2007

I tried that; the slate still doesn't seem to want to receive TC. It's pretty much brand-new. I've been putting off going to Trew to see what they have to say about the cable.

Another Q for you guys regarding framerates: as I think I mentioned, I'm using a trio of Sony action cams piped through an ATEM Mini into my laptop for streaming, and an Atomos recorder for... recording. The last DJ set I recorded went very well, but the recorder was rolling at 60fps (lol, now I feel even sillier about these slate issues). The recorder was HOT when I finished, and the file for about an hour of footage was like 193GB, lol. I can't seem to find a setting in the recorder to force it down to a more reasonable 25fps...
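
Sanity-checking the file size: the 193GB hour is consistent with the Ninja writing ProRes 422 HQ, if I have Apple's published target rates right (~440 Mb/s at 1080p59.94, ~184 Mb/s at 1080p25):

code:

# File-size sanity check, assuming the recorder writes ProRes 422 HQ.
# Target rates are from Apple's ProRes white paper, quoted from memory.
seconds = 60 * 60                   # one hour
for label, mbps in (("1080p59.94", 440), ("1080p25", 184)):
    gb = mbps * seconds / 8 / 1000  # megabits -> gigabytes
    print(f"{label}: ~{gb:.0f} GB/hour")
# => ~198 GB vs ~83 GB, so the 193 GB hour is about what 60 fps costs.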

If I'm reading this article correctly, this problem is actually something to do with the ATEM and the cameras. I can't really make heads or tails of it; all three cameras are set to 25p, but they're still imaging and outputting at 60fps? I've looked in the ATEM software, and all three cameras' framerates in the Cameras tab show 1/25, but no matter what I do the Atomos recorder still sees its input at 60.

Mister Speaker fucked around with this message at 19:55 on Aug 4, 2020

Mister Speaker
May 8, 2007


VoodooXT posted:

It sounds like the cameras, while shooting at 25 fps, are probably outputting 59.94i with the 25fps pulled down.

That's what I gathered from the article, but I'm having a hard time wrapping my head around it. Is there a fix, or will the recorder always see 60fps (59.94i) as long as these cameras are connected to it?
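
To see what 'pulldown' means mechanically, I sketched the classic 2:3 case (24p spread into 59.94i). Whatever cadence the Sonys use for 25p-in-60i may differ, but the principle is the same: repeat fields in a fixed pattern, and removal means detecting that pattern and throwing away the duplicates.

code:

# Sketch of classic 2:3 pulldown: four progressive frames (A,B,C,D)
# are spread over ten interlaced fields (five 59.94i frames).
frames = ["A", "B", "C", "D"]
cadence = [2, 3, 2, 3]  # fields emitted per source frame

fields = []
for frame, count in zip(frames, cadence):
    fields += [frame] * count

# Pair fields into interlaced frames: some are "dirty" (mixed sources)
interlaced = [(fields[i], fields[i + 1]) for i in range(0, len(fields), 2)]
print(interlaced)
# [('A','A'), ('B','B'), ('B','C'), ('C','D'), ('D','D')]
# Pulldown removal = spotting this repeating pattern and rebuilding A,B,C,D.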

I solved my slate issue! It was, indeed, the cable. Red shrinkwrap denotes signal flowing from the Lemo connector to the mini-DIN; green indicates the other way around. The guy at Trew had a good laugh with me about how they should probably just use drat arrow symbols. Anyway, they had one green cable in stock, so I bought it - I think it's the most expensive cable I've ever bought, at $93 for 18". But it works!

Mister Speaker
May 8, 2007


VoodooXT posted:

It's gonna record as 59.94i and in post, you can attempt to remove the pulldown. You might have some luck with Resolve since I was on a project that shot with a camcorder that shot 24p but with pulldown to get 59.94i and they managed to get it to pure 24p with Resolve.

Hmm, OK. I wish this was something I could solve at the switcher or recorder level. The raw video at 60fps is huge, but that's not too big an issue; I'm more concerned about damaging my gear from overheating - the SSD and the recorder itself were quite warm, hot even, after an hour-long DJ set last Sunday. Should I be worried about this?

As for removing the pulldown using something like Resolve, is this absolutely necessary? I'm a relative layman (as I'm sure you can tell), but I didn't notice anything bad when I simply imported the 60fps file directly into Premiere Pro and set the export settings to a more reasonable framerate for the YouTube upload.

Lastly - to tie this all together with the other TC issue I was having - what effect does this have on the timecode I'm actually seeing on the Ninja's screen and my slate (and any other connected devices)? I think it's still rolling at 25 or 30fps, but is there going to be a discrepancy created by the cameras' 59.94/25 pulldown that makes the TC I'm scrubbing through all out of whack? Sorry if I'm not wording this in concise or accurate terms; I'm still very amateur at this stuff. Thanks again for all your help.

Mister Speaker
May 8, 2007

Can anyone tell me what this cursor icon in Premiere means?

[screenshot of the cursor icon, missing from the archive]
I'm guessing it's 'still loading media'. It appears when I try to drop some very sizeable video clips into the timeline. This one's about 97GB; it's 360 camera footage from a motorcycle ride. Should I wait this out or is it even feasible to work with clips this size?

Mister Speaker
May 8, 2007


Aix posted:

thats the insert / overwrite icon, have a look at the buttons to the right of your play-button in the preview window. its not going away. any footage thats still being worked on will just show the yellow multilingual media en attente screen

putting huge files in the timeline shouldnt be that big of a deal, tho id prefer to just clip the parts i actually need via three point editing

Thanks! I'm going to familiarize myself with three-point editing today and pull some of the highlights from the ride. Ultimately what I'm trying to do is make a short music video set to footage from various sources (360 cam, drone, action cams, still photos) that I shot at the cottage this summer, using that to cut my teeth on basic editing and do more in a similar vein later. I'm trying to work out the workflow in my head because, as I said, some of these clips are large. Should I create a session with all the large clips and cut highlights out of them to assemble in a new session? When I've done similar work in the audio realm (e.g. cutting a long take of location sound into clips, topping/tailing, EQing, etc.), most of the source clips have stayed in the same session, just at the end of the timeline a ways away from the work area. Is this acceptable in Premiere as well, or are there better practices?

Also, with regard to effect automation/keyframes... If I automate some parameters in a larger clip, and then copy and paste a piece of that clip, does the automation follow with the copy? Thanks again.

Mister Speaker
May 8, 2007


powderific posted:

Do you mean sequence? I usually keep everything I’m not using in a separate sequence, but if you aren’t doing it for someone else I’m not sure it matters. How large is large? If these are multi hour files and you’re doing like a minutes long edit I’d probably cut them down to selects in one sequence and then pull from that.

The 360 footage is about an hour and 40 minutes, give or take. I also have some 4K footage about the same length from a handful of Sony FDR-X3000 action cams that are mounted on my bike and helmet. I usually take a few hours in an afternoon and do loops around my local highway off/onramps for the footage, but it'd be the same deal if I went on a nice ride through some forest twisties up North: Some long-rear end captures because I'm not able to hit stop and record every few minutes.

Thanks for the timeline/sequence suggestions, guys. I figured that for this amateur application it wouldn't matter too much where the raw files live. I've got another semi-related question about the 360 footage...

GoPro's 360 stitching software is called Fusion Studio. I use this to render usable video from the 360 camera; it will do all the fun 'straight to social media' presets like 'little planet' and panoramic photography and all that crap, but I simply render the raw 360 files stitched, to bring into Premiere (where the GoPro VR plugin is applied to them to pan & scan and otherwise treat them like a traditional camera angle). Fusion Studio has several options for output resolution on rendering - one of these is 5.2K, which would make the aforementioned 1:40-long video about 350GB in size. Fusion Studio also crashes every time I try to render in 5.2K, no matter the file size. Someone in the Mac Hardware thread suggested this may be a limitation of my older CPUs (a pair of 3.46GHz Xeons) and their lack of something called 'QuickSync'.

I'm wondering what 5.2K is even for; my seat-of-the-pants guess is that the extra 1.2K of resolution is overhead, so that the end product (panning around in Premiere using the VR plugin) is in 4K. Is this correct? Or is there some other reason for the extra resolution - can I use Fusion Studio's 4K render setting and still end up with true 4K video? I hope the latter is true, because otherwise I'm SOL and may as well just stick with 1080p on my other cameras. Thanks again for your time.

Mister Speaker
May 8, 2007


powderific posted:

For the crashing, does the program have an option where you can choose to use or not hardware encoding? Could be a thing where your video card doesn’t have enough memory. It could also just be a buggy program — I’ve never used it.

edit: I can’t imagine why quicksync would make it crash vs just be slow. And there are a lot of different Xeons, are we talking old single core chips or what? What’s the rest of your system?

They're the 6-core 3.46GHz Xeons. The machine is a Mac Pro 5,1 with an 8GB RX580 GPU, 64GB of 1333MHz RAM, and an SSD startup disk on the PCI bus. It's an old machine but I've never really had issues with it crunching the (very basic) video editing tasks I've thrown at it. I don't see anything in Fusion Studio about hardware encoding.

quote:

If you’re using the 360 video as choose-your-frame normal angle in Premiere I’d think you’d want the highest resolution you can get (assuming the source files are that res and it’s not just upscaling.)

Double edit: if the 360 video is 5.2k there’s no way the angle you choose is going to be a full 4K. Think about it, you’re taking a little slice of the big 360 pie, so unless it’s like 75% of the frame it’s not gonna be 4K.

Right - that's why I had assumed the extra resolution in '5.2K' was there to end up at 4K in choose-your-frame, and why I elected to try rendering video in it. I'm still thoroughly confused by the ambiguity of this screen that pops up in Fusion Studio when you go to render:

[screenshot of Fusion Studio's render-resolution options, missing from the archive]
As you explained, 5.2K total wouldn't have enough res to choose a 4K frame... so what is it for? And what if I render at 4K, is that a sphere with 4K resolution, so choosing a frame out of it would be significantly less, like even less than true 1080p? Like I said, I'm confused by the ambiguity of it all and THOUGHT that I'd be able to work with a moving frame in 4K, but I guess not.
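
Trying to reason it out with pixels-per-degree, assuming the '5.2K' stitch is roughly 5228 pixels across the full 360 degrees (a figure I've seen quoted for the Fusion - treat it as approximate):

code:

# How much resolution survives a reframe, assuming an equirectangular
# stitch whose full width covers 360 degrees (sizes approximate).
def reframe_width(stitch_width_px, fov_degrees):
    px_per_degree = stitch_width_px / 360
    return fov_degrees * px_per_degree

for stitch in (5228, 3840):          # "5.2K" stitch vs a 4K render
    for fov in (90, 120):            # plausible reframe fields of view
        w = reframe_width(stitch, fov)
        print(f"{stitch}px stitch, {fov} deg FOV -> ~{w:.0f}px wide")
# 5228px @ 90 deg -> ~1307px: a 5.2K sphere barely covers 720p/1080p,
# and a 4K sphere at 90 deg gives only ~960px. Hence "5.2K in, 1080p out".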

Thanks for helping to clear this up, I appreciate it.

Mister Speaker
May 8, 2007


powderific posted:

Your machine should be fine for the export I would think. Even if it is a quick sync thing seems like it’s a bug on the part of gopro. Is there anything else you can try to do the stitching?

The 5.2k preset is just to give you as much of the original resolution as possible, so you’ve got everything the source had to work with for reframing. It’s not based on some delivery format, but on what the camera captures. Assuming it’s a Fusion 360, it captures 5.2k so that’s why there’s a 5.2k export preset. How much resolution your reframe has is gonna depend on what kind of fov you pick.

I don't know if there's a third-party program out there to do the stitching, but I'll look around. Still not sure why Fusion Studio was hanging up on 5.2K; I'm not even rendering directly from the camera - I copied the files to my HDD and am rendering from there.

By "what kind of FOV you pick" do you mean the parameter inside the GoPro VR plugin (FOV, Yaw, Pitch, Roll, Smooth Transition are the parameters in the plug)? Does that act as a sort of 'digital zoom' reducing resolution if it's engaged at more than 0? I'm still confused as to exactly what resolution I'd be looking at post-reframe in Premiere if I exported from Fusion Studio using the 4K setting with no adjustments of the FOV parameter.

Mister Speaker
May 8, 2007


powderific posted:

Yes, that FOV. I guess you could think of it as a digital zoom, but really you're just taking square slices of the 360 sphere. The FOV is how big of a square you're taking out of that 360 sphere that's in 4K or 5.2k or whatever. I haven't used the program so I don't know if there's a way to get it to tell you exactly the output resolution of the reframe, or what the default setting is.

OK, thanks for clearing that up. Someone in another thread (I think) mentioned that 5.2K is actually the resolution necessary to get 1080p out of it, which is a bit bothersome. In a day or so I'm going to try the 4K renders I made and see if that's true; if they look blown up at 1080p, we'll know. Someone else suggested that the computer hanging up might just be it trying to go to sleep, so I'm going to run the 'caffeinate' Terminal command and see if I can get the 5.2K render to work.

Another resolution-related question, this time for the little Sony action cams I have on the bike alongside the 360 cam. Today I was shooting with them set to 4K, with the thought that even if the 360 footage only ends up being half-decent at 1080p, the extra resolution from the action cams should let me do some stabilization in post without too much distortion. BUT their batteries do not last very long at all shooting in 4K - like, less than an hour. Not nearly enough time to rip several runs down my ramps during the day, with traffic. I'm wondering how bad things would look if I simply let them shoot and record at 1080p and did some mild stabilization in Premiere... I have little experience with stabilizing in post, but I understand that if you don't have some extra res to work with, things can start to look wonky. Is it going to be real bad, or passable? TBH I kind of like a bit of trippy distortion.
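
Rough numbers on what stabilization cropping costs - the crop percentages here are illustrative guesses, not Warp Stabilizer's actual behaviour:

code:

# Post stabilization crops in and scales back up, so effective
# resolution drops. Crop percentages below are illustrative only.
def effective_res(width, height, crop_pct):
    keep = 1 - crop_pct / 100
    return width * keep, height * keep

for label, w, h in (("1080p", 1920, 1080), ("4K UHD", 3840, 2160)):
    for crop in (5, 10, 20):
        ew, eh = effective_res(w, h, crop)
        print(f"{label}, {crop}% crop -> ~{ew:.0f}x{eh:.0f} upscaled back")
# A 10% crop of 1080p leaves ~1728x972 stretched back to 1920x1080 (soft
# but often passable); the same crop of 4K still exceeds 1080p delivery.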

Mister Speaker
May 8, 2007

Not really sure how to ask this, but is it A Thing with action cameras that their colour washes out due to overuse or exposure to the elements? The last time I used my Sony FDR-X3000s was to record a motorcycle ride; they were in their protective cases, mounted to my bike and helmet, but it was a bit nippy outside. Early this morning I used them to record a DJ mix and immediately noticed something very off about their colour.

Mister Speaker
May 8, 2007

I haven't tried colour grading yet, no. A friend suggested it may be that the white balance settings changed, but I haven't altered anything except switching between 4K and 1080p recording for some rides.

The room wasn't dim, that's with my Hue bulbs on maximum (though they are behind me) and the Rotolight you can see in frame lighting the gear. Compare to an earlier mix and there's definitely something different going on. I will try those suggestions though, thanks!

Mister Speaker
May 8, 2007

Can't think of a better thread for this question.

Is there any way in Adobe Premiere to automatically track an object in frame and apply a blur or mosaic to it? For example, someone's face moving around in frame... or the speedometer of a motorcycle?
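
For the curious, outside the Adobe tools the underlying technique is just object tracking plus a localized blur. A rough OpenCV sketch of the idea - 'ride.mp4' is a placeholder, and the CSRT tracker needs the opencv-contrib-python package (it lives under cv2.legacy in some newer builds). Not suggesting this replaces the Premiere/AE route; it's just what the trackers are doing under the hood.

code:

# Track a region and mosaic it, frame by frame.
import cv2

cap = cv2.VideoCapture("ride.mp4")          # placeholder filename
ok, frame = cap.read()
box = cv2.selectROI("pick the speedo", frame)  # drag a box, press Enter
tracker = cv2.TrackerCSRT_create()
tracker.init(frame, box)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, rect = tracker.update(frame)
    if found:
        x, y, w, h = [int(v) for v in rect]
        x, y = max(x, 0), max(y, 0)         # clamp to the frame edges
        w = min(w, frame.shape[1] - x)
        h = min(h, frame.shape[0] - y)
        if w > 0 and h > 0:
            roi = frame[y:y + h, x:x + w]
            # mosaic = shrink hard, then blow back up nearest-neighbour
            small = cv2.resize(roi, (max(1, w // 16), max(1, h // 16)))
            frame[y:y + h, x:x + w] = cv2.resize(
                small, (w, h), interpolation=cv2.INTER_NEAREST)
    cv2.imshow("preview", frame)
    if cv2.waitKey(1) == 27:                # Esc quits
        break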

Mister Speaker
May 8, 2007

Ahh, true. I've not really used After Effects much, but I'll give it a shot; there are some tutorials on this very thing. Thanks!

Now comes a workflow question. Like I said, I've not used AE much, and I'm curious about how Adobe's 'bridging' workflow works with video. I've only ever once used Photoshop as an extension of Lightroom, to create a fake 'double exposure' before returning to LR to edit the image further - how does this work with video? I imagine there's a similar function for clips or the entire timeline to 'open in After Effects'. I'm guessing the most efficient way to go about this (heavily hinted at by the After part of 'After Effects') would be to edit the video together - fades and dissolves and cuts and playback-speed adjustments and all - THEN send it to AE for the automatic masking process. Is this correct? How much processing time am I looking at between the two applications; is exporting from Pr to AE any faster than rendering a finished movie file out of Pr? Can I, or will I have to, return to Pr to render the final movie file? How much of this process is destructive, i.e. committing to edits?

Mister Speaker
May 8, 2007

Big up everyone, thanks for the tips. I'm going to try what powderific mentioned about Premiere's built-in tracker, since what I'm after is fairly rudimentary and doesn't need to do much more than obscure a speedometer that's barely even in frame. I might not even need to do that - since this is pretty shaky helmet camera footage, the other option is to stabilize and crop down... Although Warp Stabilizer says it's going to take its sweet time (~180 minutes for the full unedited seven-minute clip). I'm hoping that's to be expected for a Mac Pro 5,1 with an RX580 GPU.

Mister Speaker
May 8, 2007

I've got a question that's more out of curiosity than anything else. Let's say you're filming a shot that involves a television or computer screen. Usually this means you're looking at some frame/field artifacting from the television's refresh rate disagreeing with that of your camera's sensor. But I've seen media (currently I'm rewatching Snatch; in the opening scene there are CRT monitors shot head-on) where that doesn't really happen. Another good example is any news broadcast that has reporters standing in front of a real, actual screen (as opposed to chromakey). Am I correct in assuming there's got to be some sort of synchronization going on, presumably the same signal you'd run between double-system audio recorders, timecode slates, etc., to your camera?
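
Genlock - tying camera and display to one sync generator - is indeed a thing, but even without it the arithmetic shows why some shutter/refresh combinations look clean. A quick sketch, assuming a 60 Hz display:

code:

# Why a screen flickers on camera, in numbers: if the exposure doesn't
# cover a whole number of display refreshes, each frame catches a
# different slice of the scan and you get a rolling band.
def refreshes_per_exposure(shutter_s, refresh_hz):
    return shutter_s * refresh_hz

for shutter, label in ((1 / 48, "1/48 (180-degree at 24fps)"),
                       (1 / 60, "1/60"),
                       (1 / 30, "1/30")):
    n = refreshes_per_exposure(shutter, 60.0)   # assumed 60 Hz display
    verdict = "clean" if abs(n - round(n)) < 1e-9 else "bands/flicker"
    print(f"shutter {label}: {n:.3f} refreshes per frame -> {verdict}")
# 1/60 and 1/30 integrate whole refreshes -> even exposure; 1/48 doesn't.
# The bulletproof on-set fix is syncing camera and display to one source,
# the same spirit as jamming timecode.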

Mister Speaker
May 8, 2007

I think I've asked this before, but what is the general best practice for timecode sync and what 'department' usually holds the master clock?

Today I worked as audio recordist on a very amateur project for a friend, and it felt really nice getting back into the swing of things and learning my gear (since I also supplied basically everything for the shoot). It's been a while since I've done anything like this, and while today definitely boosted my confidence for future gig opportunities, it took me longer than I wanted to get a timecode solution working with the little wireless boxes I had (a pair of UltraSync Ones and the Atomos device in the same ecosystem), and throughout the day I became curious about how to best handle master/slave (this nomenclature sucks) relationships with the gear.

At first, the audio recording device was the master and the wireless boxes simply existed as transceivers. But they also generate clock themselves, and I noticed that of all the gear used, their tiny little batteries lasted all day - so when I had to replace the AAs in the Zoom recorder, I went "I hope this works" and changed its connected One over to internal/master. It didn't skip a beat, and powering down the recorder didn't affect any other device's TC signal.

So real-deal film sets must have some high-quality dedicated clocks, right? A box that clocks, and nothing else? I assume so, because recording studios often have the same thing; even if you're using primo Avid converters your million-dollar studio probably also has some incredibly robust clock. Whose department possesses this mythical box and how many slaves does it tend to have?

Going forward in my amateur productions, I think I'm just going to use the li'l wireless box itself to govern the Zoom recorder (wired), and the Atomos monitor and timecode slate (wirelessly). Unless there's some critical fault with this plan? I think the slate itself generates a muscular clock signal but the drat thing runs on six AAs and I'm not aware of a solution to power it while swapping batteries.

Mister Speaker fucked around with this message at 02:36 on Aug 21, 2022

Mister Speaker
May 8, 2007

Thanks pals. I figured there wasn't inherently much wrong with using the Ones; it just feels odd, I guess, to rely on such a tiny piece of gear as the clock, but like I said, the batteries last forever, and I believe you can even hot-charge them from USB while they run.

Total newb "I think I know the answer but" follow up: to 'jam' sync is to force a clock reset signal to a device such that it lines up in case it's drifted or was started at a different time, correct?
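
To check my own understanding, a toy model of the jam - copy the external count once, then free-run on the local crystal; re-jamming later snaps out whatever drift has accumulated since:

code:

# Toy model of jam sync. Numbers are illustrative only.
import time

class TCGenerator:
    def __init__(self, fps=25):
        self.fps = fps
        self.origin = None           # (frame_count, local_time) at the jam

    def jam(self, external_frames):
        """One-shot: adopt the external clock's current frame count."""
        self.origin = (external_frames, time.monotonic())

    def now(self):
        """Free-run from the jammed point using the local clock."""
        frames, t0 = self.origin
        return frames + int((time.monotonic() - t0) * self.fps)

gen = TCGenerator()
gen.jam(external_frames=90_000)      # e.g. jammed at 01:00:00:00 at 25 fps
time.sleep(0.2)
print(gen.now())                     # ~90_005: counting on its own now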
