MrMoo
Sep 14, 2000

Does anyone have recommendations for refactoring a reasonably straightforward but JavaScript-heavy site from 2009?

I haven't kept up to date with anything and barely remember how it was implemented. It still seems to work reasonably well, but it looks like load performance can be improved with newer jQuery versions, which have shrunk roughly 10x in size.


Over the years some technologies have broken or changed: the site originally used Google Gears to accelerate uploads for administration, and that was replaced by the equivalent HTML5 APIs. The admin pages also used CSS background images for list boxes, and every browser has subsequently dropped support for that too.

MrMoo fucked around with this message at 02:50 on Oct 6, 2013


MrMoo
Sep 14, 2000

I'd like to remind everyone about the new Web Animations API percolating through development. It is generally about providing an enhanced timeline that supports integration of GPU (CSS-based) and CPU (JavaScript-based) animations. There is a "polyfill" implementation in JavaScript for browsers without native support; basically it emulates native support at some performance cost:

https://github.com/web-animations/web-animations-js

Chromium has native support currently in development; I'm watching commits for EffectCallback this week.

I picked this up this weekend to implement a basic news-ticker proof of concept. I need GPU acceleration because it runs across 8 monitors :stare: and rather blows on CPU, shoving out an effective 4K screen in software; however, I also need CPU support for pulling in and processing new headlines asynchronously.

The docs and specification aren't exactly overly useful, but there are oodles of test cases for the polyfill, which are.
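As a rough sketch of the API shape (the element id here is hypothetical, not from any real page): you hand `animate()` a list of keyframes plus timing options, both of which are just plain JavaScript data.

```javascript
// Keyframes and timing are plain objects, so they can be built and
// inspected in ordinary JavaScript before the browser touches them.
const keyframes = [
  { transform: 'translateX(0px)' },
  { transform: 'translateX(-1000px)' }
];
const timing = { duration: 10000, iterations: Infinity };

// In a browser with native support or the web-animations-js polyfill:
if (typeof document !== 'undefined') {
  const ticker = document.getElementById('ticker'); // hypothetical element
  if (ticker && ticker.animate) {
    const player = ticker.animate(keyframes, timing);
    // The returned player exposes play()/pause()/currentTime, which is
    // where the JavaScript-side (CPU) control comes in.
  }
}
```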

MrMoo fucked around with this message at 19:09 on Mar 17, 2014

MrMoo
Sep 14, 2000

The company I'm presently at loves to outsource operations, so the majority of those sites require > IE8 and thus users need Chrome, but all the horrendous internal sites only work in IE6-8, so the user has to constantly juggle back and forth.

Still amazed that VMware vCloud Director has really poo poo browser support; who writes something new that's this broken?



It doesn't work in IE8 or anything using WebKit; Firefox is the only reliable client.

MrMoo
Sep 14, 2000

Kobayashi posted:

I'm not fluent in spec-speak. Does this give more flexibility than cubic bezier curves? More curve points are necessary for iOS-like physics animations without Javascript.

The entire point is that you can control CSS GPU-accelerated animations with JavaScript, so you can create your own easing mechanism for everything. There is a warning on the draft that they are thinking about limiting this flexibility in the future, so you may want to shout on the mailing list to point them in the right direction.
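As a sketch of what "your own easing mechanism" can mean (the function and constant here are mine, not from the spec): an easing is just a function from linear progress to eased progress that you sample each frame from JavaScript.

```javascript
// A spring-ish ease-out: overshoots slightly past the target, then
// settles. Maps input progress t in [0, 1] to eased output progress.
function easeOutBack(t, overshoot = 1.70158) {
  const u = t - 1;
  return u * u * ((overshoot + 1) * u + overshoot) + 1;
}

// Endpoints are preserved, so the animation starts and ends exactly
// where plain linear progress would.
easeOutBack(0); // 0
easeOutBack(1); // 1
```

Feeding values like these into a transform each frame is the kind of iOS-style physics curve that plain `cubic-bezier()` can't express.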

MrMoo fucked around with this message at 00:38 on Apr 25, 2014

MrMoo
Sep 14, 2000

There is JavaScript modifying the inline style with a matrix-based transform. The script looks like the jQuery NiceScroll plugin, given that it includes a match on "matrix".

MrMoo
Sep 14, 2000

Suspicious Dish posted:

SVGs can be slower to render than PNGs, especially on mobile, so keep that in mind.

SVGs are drawn in hardware on modern devices and are actually faster than PNGs when the file size is smaller. Unfortunately, for complex images SVGs can mushroom in size quite quickly.

The biggest problem is that Firefox and MSIE are usually terrible at rendering resized SVGs. If you are targeting Chrome and Safari only then it is no problem; you can probably even use the new responsive image or picture tags to provide both.

MrMoo
Sep 14, 2000

.animate() is CPU-based; on a mobile platform you want .transition() to use the GPU. The jQuery Transit API should be ideal for your needs.
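To illustrate the difference (this helper is mine, not part of jQuery Transit): a GPU-friendly move animates `transform` via a CSS transition, rather than ticking `top`/`left` from a JavaScript timer the way `.animate()` does.

```javascript
// Illustrative helper: build the inline-style properties for a
// GPU-friendly move. Animating transform lets the compositor handle
// it; animating top/left forces CPU-side layout on every frame.
function gpuMove(dx, dy, ms) {
  return {
    transform: `translate3d(${dx}px, ${dy}px, 0)`, // composited on the GPU
    transition: `transform ${ms}ms ease-out`
  };
}

// In a browser you would apply it with:
//   Object.assign(el.style, gpuMove(200, 0, 500));
```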

MrMoo
Sep 14, 2000

Try to work on a jsfiddle that highlights the problem; the pastebins are not easy to understand. 60fps animation is difficult; read up on the guides at http://jankfree.org/ for more. It sounds like you have the fundamental 3D transforms in place and it is other things that are breaking the performance. Use Chrome's developer tools to diagnose performance problems such as repaints and re-layouts. The tools change a little every release, so they're an annoying moving target at times, plus the docs are non-existent.

My approach is to strip everything down to the bare minimum, show that it works, and then slowly add functionality back. You should be able to bisect what is breaking performance.

MrMoo
Sep 14, 2000

Munkeymon posted:

I'm not aware of standards around this - I just see "X is optional" and think immediately of the query, not the path. You've gone one farther and said "X is optional and just modifies the output". /place/id?xml=true is a clearer logical parallel that I think makes the case a bit better.

However, ?xml=true is a special case where an HTTP header should be used instead: Accept: application/xml versus Accept: application/json.
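A minimal sketch of that negotiation on the server side (the function name is mine; a real implementation would also honor q-values and wildcards, e.g. via Express's `res.format`):

```javascript
// Pick a response format from the request's Accept header instead of
// a ?xml=true query parameter.
function negotiate(acceptHeader) {
  const accept = (acceptHeader || '').toLowerCase();
  if (accept.includes('application/xml')) return 'xml';
  return 'json'; // sensible default for an API
}
```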

MrMoo
Sep 14, 2000

A hilarious tidbit of performance information that is hard to come by: if you want a web-based marquee across multiple monitors, you will see much better performance configuring the monitors vertically and using XRandR to rotate each screen to look correct.

That's the sum of effort of nVidia, Google, NYSE, Reuters, and Zignage for the last month :lol:

Hopefully you will see it more on TV when Alibaba IPOs.

MrMoo
Sep 14, 2000

The allegations are that a search engine should penalize content in display: none because it encourages meta-tag-style abuse.

MrMoo
Sep 14, 2000

samglover posted:

I'm starting to feel like the front page of my website (lawyerist.com) is a big mess.

It's very similar to another messy site: http://www.cnet.com

MrMoo
Sep 14, 2000

fletcher posted:

What's the cutover for when an HTTP file upload should instead be using FTP? Or is there one? I noticed Amazon S3 supports big file uploads over HTTP, but they also have a Java applet thing (I don't want to go down that route...)

Amazon has chunked HTTP uploads, I guess in around 100MB chunks, because TCP is so reliable. I think you can pass a checksum in the POST and the server will verify it.

FTP has no joy at all.

MrMoo
Sep 14, 2000

A browser, yes; the command line, maybe something in Java. Here's their original announcement:

http://aws.amazon.com/about-aws/whats-new/2010/11/10/Amazon-S3-Introducing-Multipart-Upload/

Another one with a picture and details of how to do it on the command line: basically split the file and use curl with a few custom headers.

https://aws.amazon.com/blogs/aws/amazon-s3-multipart-upload/

MrMoo
Sep 14, 2000

The Merkinman posted:

Spartan: Oh great! M$ made their own engine again instead of going with something like WebKit :argh:

Honestly I think the current state of three active participants (Edge, Gecko, Blink) is very good for maintaining HTML standards and pulling everyone out of proprietary lock-ins. Once legacy Trident has died there will be minimal problems. The gigantic caveat is that Edge being WX-only really is a problem for the next 10 years.

MrMoo
Sep 14, 2000

Facebook or Wix.com (advertised on TV a lot) could be principal candidates. But note a lot of the detail can be included on Google Local, Yelp, and many other services such as Expedia.

MrMoo
Sep 14, 2000

kedo posted:

Don't use Wix, it is sooooooo terrible.

I wondered, given all the money they must be burning through on advertising.

MrMoo
Sep 14, 2000

The beginner's guide worked for me porting from Lighttpd.

MrMoo
Sep 14, 2000

fletcher posted:

Put your config files in /etc/nginx/sites-available and then symlink them to /etc/nginx/sites-enabled

I think that's a Debian thing, as I had that on Ubuntu/Lighttpd but not on CentOS/Nginx.

MrMoo
Sep 14, 2000


This suffers the same eventual-consistency issues as everything else; research and new products are starting to patch up those issues, aiming at ACID compliance.

MrMoo
Sep 14, 2000

I like the canvas/SVG thing LG have on Ars today.

MrMoo
Sep 14, 2000

darthbob88 posted:

What's the Best Practices way to organize an API to use WebSockets? Working on a personal project, I've decided that I'm probably going to need a WebSocket server for sending data to the user, so I'm looking into this library, which seems easy to work with. Given the example on that page, the best option for organizing the API seems to be either a single server, with each call handled in a large OnMessage handler, or a server instance for each API call, but neither feels quite right to me.

It all depends how far you want to go to cope with scale. A very elegant method is to use Mongrel2 to fan out all HTTP and WebSocket requests over ZeroMQ; then you can have farms of servers handling requests however you like.
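On the simpler single-server option from the question: the one big OnMessage handler stays manageable if it just routes on a type field. A sketch (the message shape and handler names are mine):

```javascript
// Map message types to small handler functions so the single
// onmessage entry point stays a one-liner dispatch.
const handlers = {
  echo: (payload) => payload,
  sum: (payload) => payload.reduce((a, b) => a + b, 0)
};

function dispatch(rawMessage) {
  const { type, payload } = JSON.parse(rawMessage);
  const handler = handlers[type];
  if (!handler) throw new Error(`unknown message type: ${type}`);
  return handler(payload);
}
```

In a real server, each WebSocket's onmessage would call `dispatch` and send the result back on the same socket.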

MrMoo
Sep 14, 2000

The Wizard of Poz posted:

Is it normal for a company to insist that their devs do all of their work in a VM?

Seems a perfectly reasonable request; why do you care?

MrMoo
Sep 14, 2000

Latency and slowness should not be an issue on any LAN or even a metropolitan network; you may wish to investigate the tools you are using.

For example, for consoles use mosh, and for a Windows UI use RDP v8 with UDP enabled. I always work on remote machines on terrible networks and generally it hasn't been too bad. I do use a local web browser though; remote ones are not always that great, and a Java UI is almost always terrible.

MrMoo fucked around with this message at 03:05 on Feb 15, 2016

MrMoo
Sep 14, 2000

The Wizard of Poz posted:

Well that's what I assumed, but now we've been told that we can't duplicate each other's VMs and we each need to separately set our own up from scratch.

Probably due to licensing if they are Windows VMs. You can always just use non-activated versions and rebuild every 90 days though, so idk?

MrMoo
Sep 14, 2000

Generally a VM needs less memory than a physical server. If you are building the VMs yourself, do you have the guest tools appropriate for the VM vendor? They can make a gigantic difference in performance.

MrMoo
Sep 14, 2000

I believe TinEye was created to do that.

MrMoo
Sep 14, 2000

It is pretty swell; barely any competition worth anything. Built into MSVC 2015 under a ridiculously long Azure product name.

Supported by a lot of sites, with similar functionality in Google's API Explorer.

It's a kinda slow-moving standard for adding all the nice features for auto-generated APIs, but it's getting there.

MrMoo
Sep 14, 2000

Stay the hell away from OData, what a mess.

MrMoo
Sep 14, 2000

fuf posted:

just install Visual Studio (free Community edition) and you're set.

Microsoft still insists on complicated licensing arrangements; the "free" edition can never just be "free":

quote:

An unlimited number of users within an organization can use Visual Studio Community for the following scenarios: in a classroom learning environment, for academic research, or for contributing to open source projects.

For all other usage scenarios:
In non-enterprise organizations, up to five users can use Visual Studio Community. In enterprise organizations (meaning those with >250 PCs or >$1 Million US Dollars in annual revenue), no use is permitted beyond the open source, academic research, and classroom learning environment scenarios described above.
https://www.visualstudio.com/products/visual-studio-community-vs

MrMoo
Sep 14, 2000

I presume some kind of propagation delay between the DNS and web configurations that ends up pointing somewhere strange.

MrMoo
Sep 14, 2000

You need straight TCP forwarding, which is easier in something like HAProxy.
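A minimal sketch of what that looks like in an HAProxy config (the addresses and names are made up): `mode tcp` forwards the raw bytes without HAProxy ever parsing HTTP.

```
# Plain TCP pass-through: HAProxy never inspects the payload.
frontend tcp_in
    mode tcp
    bind *:8443
    default_backend tcp_servers

backend tcp_servers
    mode tcp
    server app1 192.0.2.10:8443 check
```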

MrMoo
Sep 14, 2000

Swagger (or OData which can generate Swagger).

1: Documentation.
2: Online testing of the API.
3: Automated generation of platform SDKs from an API decl.
4: Automated generation of API decl from service implementation.

3 & 4 are very much a work in progress and are completely poo poo for some combinations.

MrMoo
Sep 14, 2000

Windows XP was the blocker, no problem now though.

MrMoo
Sep 14, 2000

fuf posted:

When you try and verify a google search console property using an existing analytics tracking code it says:
"Your tracking code should be in the <head> section of your page."

Google tracking builds up a runtime object with a queue of events to send asynchronously for logging; it would appear some event is ready to be tracked but the tracker object has not been defined yet.
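The usual snippet works around exactly this with a command-queue stub; here is a simplified sketch of the pattern (names trimmed down from the real snippet, and the tracking id is the standard placeholder):

```javascript
// Command-queue stub: calls made before the real library loads are
// pushed onto a queue, which the library replays once it arrives.
function installStub(global, name) {
  const stub = function () {
    stub.q.push(arguments); // record the call for later replay
  };
  stub.q = [];
  global[name] = stub;
}

const fakeWindow = {};
installStub(fakeWindow, 'ga');
fakeWindow.ga('create', 'UA-XXXXX-Y'); // queued, not lost
fakeWindow.ga('send', 'pageview');
```

This is why the snippet belongs in `<head>`: the stub must exist before any page code tries to track an event.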

MrMoo
Sep 14, 2000

Is there any standard for wholesale -> retailer website integration for supporting drop shipments? That's a fairly wide net as it is and could mean so many things.

MrMoo
Sep 14, 2000

Been challenged to create a 60fps synchronized video wall in HTML5, starting with a single computer and multiple ports, and scaling up towards multi-computer synchronized playback for much bigly displays. I may start a project.log, as last time it took 3 months to get news & quote tickers running across 6 displays.

The technologies of choice are DASH streaming with WebRTC-based sync; I've seen multiple Chrome Experiments using this method quite successfully since 2013:

https://vimeo.com/60992231

MrMoo
Sep 14, 2000

It's targeting a customizable signage solution, so they have single super-size videos that cover the entire wall, but from what I've seen these have been split up, because I don't think anything is going to play 12x 4K resolution in a hurry. They also have a lot of odd videos that need to be played side by side and looped in various forms, all of different lengths and sizes.

Easy first step would be to at least preprocess the video so that you have easier chunks to work with.

ooh, one of the target walls is this thing, so sometimes single adverts, sometimes multiple:




and this is the first wall to fix, as it is looking super with tearing and other awesome artifacts:

MrMoo fucked around with this message at 18:55 on Dec 1, 2016

MrMoo
Sep 14, 2000

Skandranon posted:

Very cool indeed, this is pretty close to my industry actually... Do you know how many frames you can be off? I'm guessing very few. I would be surprised if you could offer any sort of guarantee of frame accuracy via HTML5, especially at 60fps. Is it an absolute "drat the torpedoes" decision to use HTML5 video? If not, I would first try to really clarify what your tolerances are and if you think that can be met, before investing too much into an avenue which may never pan out the way it needs to.

It's a hard 60fps; this is what the tickers already run at NYSE with live data at 4K resolution. On gigantic display boards any visual errors are significantly amplified over a desktop environment, which is why it is rather challenging. One could say the new implementation is more of a commodity, leveraging Chromium for everything; previous implementations were bespoke hardware and bespoke software, with the additional costs and maintenance that implies. There is a big question on tolerances, which is why I previously used Web Animations for an independent frame rate and sub-pixel rendering; that ultimately proved incredibly powerful and visually better than pure CSS hardware animation, where browser and system jitter interrupt everything. The primary question is whether this entire thing is feasible in a web browser, and that is my project from today.

It looks like it has been done before: the Chrome Pixel Tree project used a Node.js server for synchronization and relaying of control signals. If one looks at something like the JetBlue terminal at various airports, they have a super-sized display:



Is that a real wide display or just 10 monitors repeated? Is it in a web browser and is it 60fps? idk, looks pretty nice though.

LAX JetBlue has a couple of infamous Windows XP desktops showing, so probably some custom app or Flash monstrosity.

MrMoo fucked around with this message at 22:42 on Dec 1, 2016


MrMoo
Sep 14, 2000

4K is the glorious combination of 6 HDTVs at the joke resolution of 1366x768, or whatever the usual low-end size is. However that is all on one PC; it is going to be interesting scaling up, either with a server or peer-to-peer. The other important caveat is that the servers are low-profile boxes that you cannot jam an nVidia Titan into.

For some reason nVidia has special cards for digital signage, but they are a generation or two behind those for gamers; we already had to use better hardware for the tickers compared to the regular wallboards like the one showing the DJI index. They provide bezel correction and allegedly frame sync.



Behold the power of flot.

MrMoo fucked around with this message at 22:51 on Dec 1, 2016
