|
Newf posted:A question about cookies. When I arrive here at the something awful forums, my browser is already logged in. You probably want to edit out that 'secret', unless it's already edited / obfuscated. As for help, look at what cookies are sent when you log in and with each request during that session. Then look at what cookies are sent after that. Specifically, check the expiration date / time they're given when you log in. Lumpy fucked around with this message at 17:18 on Jan 29, 2019
# ? Jan 29, 2019 01:49 |
|
|
Lumpy posted:You probably want to edit out that 'secret', unless it's already edited / obfuscated. Good catch on the secret, but it's from a couch install on a dev machine that typically isn't connected to the internet. And of course now you've quoted it, so I'm sunk regardless. I'll have a look at the incoming / outgoing cookies tomorrow and make some more posts posts posts.
|
# ? Jan 29, 2019 04:50 |
|
Newf posted:Good catch on the secret, but it's from a couch install on a dev machine that typically isn't connected to the internet. And of course now you've quoted it, so I'm sunk regardless. I don't know what you're talking about!
|
# ? Jan 29, 2019 17:19 |
|
So it turns out that I was the problem. I had a closer look and found that the session cookie was still around after the page refresh, and that requesting the user's personal database via code:
Related question: if I want users to find themselves 'logged in' when returning to the site after a long absence, is that achieved by just setting a big value for the 'timeout'? Currently it's sitting at 600 (seconds). Is it common practice to set something like 60*60*24*30*6 for a cookie to last ~6 months?
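For what it's worth, a cookie with no Max-Age or Expires is a session cookie and dies with the browser, so the "still logged in months later" behaviour does need a long lifetime set explicitly. A minimal sketch of building such a header by hand in Node — the cookie name and session id here are made up, and the server-side session has to live that long too for this to actually work:

```javascript
// Sketch: hand-building a long-lived Set-Cookie header value.
// "session" and "abc123" are illustrative placeholders, not from the thread.
const SIX_MONTHS_SECONDS = 60 * 60 * 24 * 30 * 6; // 15,552,000 seconds (~6 months)

function buildSessionCookie(sessionId, maxAgeSeconds) {
  // HttpOnly keeps the cookie away from client-side JS; Secure limits it to HTTPS.
  return [
    `session=${encodeURIComponent(sessionId)}`,
    `Max-Age=${maxAgeSeconds}`,
    'Path=/',
    'HttpOnly',
    'Secure',
  ].join('; ');
}

const header = buildSessionCookie('abc123', SIX_MONTHS_SECONDS);
console.log(header);
// session=abc123; Max-Age=15552000; Path=/; HttpOnly; Secure
```

Six-month-ish Max-Age values on a "remember me" cookie are common practice; the thing to watch is that the token it stores stays revocable server-side.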
|
# ? Jan 29, 2019 18:31 |
|
I think this is the best thread for my question ~ I have an Elasticsearch server that I run with an endpoint that is exposed only to my local computer and my web server via nginx configuration files. I got JavaScript APM working last night (badass btw) and I kind of want to go crazy with it (tags / timespans / adding the OIDC user to the user context) ... but with a JS app hitting my endpoint, is protecting it with only 'deny all' (except 2 IPs) the way I should proceed with this? I don't know what the deal with X-Pack is right now - they just changed the name and it's now open source or something, I can't quite find a straight answer. I hope the C# APM agent is done soon. My setup is Vue.js SPA -> API Gateway -> APIs (C# .NET Core 2.2) and it works great in production right now.
|
# ? Feb 3, 2019 17:25 |
|
karma_coma posted:I think this is the best thread for my question ~ From what I remember, Elasticsearch is pretty wide-open if you have access to it, right? Is the APM side of it on port 8200 any different? What I did for ES was to have a back end endpoint that made the request itself from the web server so I could keep the nginx IP restrictions. If APM is just open on a single port, then opening up that one thing should be OK. But hopefully there's some kind of flood protection.
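For reference, a sketch of the nginx-level restriction being described — the IPs, server name, and rate-limit zone are all placeholders, and `limit_req_zone` would need to be defined in the `http` block:

```nginx
# Assumes, in the http block:
#   limit_req_zone $binary_remote_addr zone=apm_zone:10m rate=10r/s;
server {
    listen 443 ssl;
    server_name apm.example.com;      # placeholder

    location / {
        allow 203.0.113.10;           # web server (placeholder IP)
        allow 198.51.100.20;          # local machine (placeholder IP)
        deny  all;

        # crude flood protection on top of the IP allowlist
        limit_req zone=apm_zone burst=20 nodelay;

        proxy_pass http://127.0.0.1:8200;   # APM server's default port
    }
}
```

The catch the post is pointing at: browser RUM traffic arrives from end users' IPs, so an allowlist like this blocks it. Proxying those requests through your own back end, as described, is what lets the IP restriction stay in place.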
|
# ? Feb 3, 2019 20:24 |
|
In dumb web news, people on the Chrome team are pushing for sharp limits on page resources (2 MB per page in images, 500 kb per page in scripts), throttled limits on CPU usage in JS, or both, which to me seems like a great way to violently murder web games and to push users of business web apps to switch to Firefox to get out of a repeat of UAC hell.
|
# ? Feb 3, 2019 22:01 |
|
Roadie posted:In dumb web news, people on the Chrome team are pushing for sharp limits on page resources (2 MB per page in images, 500 kb per page in scripts), throttled limits on CPU usage in JS, or both, which to me seems like a great way to violently murder web games and to push users of business web apps to switch to Firefox to get out of a repeat of UAC hell. They can't be that dumb. People will still make websites bigger than that using dirty hacks. Pride comes before the fall, I guess. If they really want to do that, they could invent a "norma 2M" standard, set these rules, and have websites that follow them run on an optimized core of Chrome that is faster. Websites will self-manage to stay under these limits. Frameworks will invent ways to help operate under them. And so on. But hey.. maybe there's nothing bad about fat websites. What we really need to do is send a few politicians to the firing range so the soldiers get practice. loving legal nonsense cookie protection and the other bullshit.
|
# ? Feb 3, 2019 22:13 |
|
Tei posted:But hey.. maybe there's nothing bad about fat websites. What we really need to do is send a few politicians to the firing range so the soldiers get practice. loving legal nonsense cookie protection and the other bullshit. What
|
# ? Feb 3, 2019 23:07 |
|
Kobayashi posted:What Have you seen the web recently? There are popups everywhere with legal nonsense. These idiot politicians need to be fired.
|
# ? Feb 3, 2019 23:11 |
|
Instead of this 2MB limit, maybe think about the fact that native applications can have more than 2MB of resources and still be snappy. Then think about whether the current paradigm of how the web browser + JS works is the best way of closing that delta. A 2MB limit is just a bandaid on a much deeper problem.
|
# ? Feb 3, 2019 23:51 |
|
Tei posted:Have you seen the web recently? There are popups everywhere with legal nonsense. These idiot politicians need to be fired. You talking about GDPR or something different? Cos if it's the former, wtf mate
|
# ? Feb 4, 2019 00:03 |
|
RobertKerans posted:You talking about GDPR or something different? Cos if it's the former, wtf mate Some websites are run by idiots who think that a TOS popup completely immunizes them against GDPR, when GDPR specifically says you can't do that.
|
# ? Feb 4, 2019 00:14 |
|
Roadie posted:Some websites are run by idiots who think that a TOS popup completely immunizes them against GDPR, when GDPR specifically says you can't do that. The black hat UI popups that take 5 minutes to check and save user preferences (do they really, though? or is that just to push people to accept all the tracking cookies?) are the ones that gently caress me off the most. Several companies seem to have all realised they can make $$$$$$$ by persuading US media orgs that GDPR is super duper complicated so that they can sell their ultra lovely anti-user software to them :rage: RobertKerans fucked around with this message at 01:05 on Feb 4, 2019
# ? Feb 4, 2019 01:02 |
|
I'm all for privacy and knowing what someone's doing with my data, but "Hey! This web site uses cookies!" banners to accept/clear at the bottom of every web site is right there behind pop-ups asking you to subscribe to a web site's mailing list because you spent more than five seconds on it on my web site annoyance list.
|
# ? Feb 4, 2019 01:05 |
|
GI_Clutch posted:I'm all for privacy and knowing what someone's doing with my data, but "Hey! This web site uses cookies!" banners to accept/clear at the bottom of every web site is right there behind pop-ups asking you to subscribe to a web site's mailing list because you spent more than five seconds on it on my web site annoyance list. True, it is annoying, but if this is EU specific, then the actual rules say that cookies necessary for the site to function are allowed and no consent need be given. Many cookie popups are either pointless, or they are actually referring to user-tracking cookies (doesn't mitigate the annoyance, though).
|
# ? Feb 4, 2019 01:17 |
|
Roadie posted:In dumb web news, people on the Chrome team are pushing for sharp limits on page resources (2 MB per page in images, 500 kb per page in scripts), throttled limits on CPU usage in JS, or both, which to me seems like a great way to violently murder web games and to push users of business web apps to switch to Firefox to get out of a repeat of UAC hell. That is pretty funny in that this clearly represents giving up on making Blink less of a slavering resource hog and figuring that since they're Chrome they can just foist that on the rest of the world. On the other hand, smacking down all the sites that load three or more times as much poo poo by volume as they need just to track users and serve ads would be pretty funny... Well, I'm torn. Tei posted:But hey.. maybe there's nothing bad about fat websites. What we really need to do is send a few politicians to the firing range so the soldiers get practice. loving legal nonsense cookie protection and the other bullshit. Just because some bad advice about cookies escaped from the late 90s into the GDPR doesn't make it a bad law overall, dude.
|
# ? Feb 4, 2019 17:29 |
|
I love JavaScript and I have been abusing JavaScript since before it was considered a cool thing. But I lament how pervasive JavaScript is everywhere. We are at a point where it would be hard to make websites without JavaScript. I would not mind having something like a self-downgrade, a meta tag where we could declare "this text/html file doesn't have JavaScript, disable all JavaScript execution within it". And for that to have some sort of advantage other than the obvious.
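Something close to this self-downgrade actually exists: a Content-Security-Policy delivered via a meta tag can forbid all script execution for the page. A minimal sketch (the page content is made up):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Blocks all script execution in this document, inline or external -->
  <meta http-equiv="Content-Security-Policy" content="script-src 'none'">
  <title>Script-free page</title>
</head>
<body>
  <p>No JavaScript will run here.</p>
</body>
</html>
```

As far as I know, though, it doesn't deliver the "advantage other than the obvious" being asked for: browsers don't currently fast-path script-free pages beyond simply skipping the script work.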
|
# ? Feb 4, 2019 18:28 |
|
I'm not that concerned with how pervasive JavaScript has become; I'm more concerned with the people developing JavaScript throwing all kinds of crap in there, especially in the NuGet-library sense.
|
# ? Feb 4, 2019 19:02 |
|
I wonder how much of modern web features are due to the evolution of ECMAScript standards vs evolution of design patterns. Like, how close could we get to a modern library like React using ECMAScript version 1.0 in Internet Explorer 4 or some poo poo? Could we have "invented" something like React much earlier? When did AJAX hit the scene? I know a bit about how HTML and CSS standards have introduced new features over time but not as much about the evolution of JavaScript features.
|
# ? Feb 6, 2019 10:24 |
|
Anony Mouse posted:I wonder how much of modern web features are due to the evolution of ECMAScript standards vs evolution of design patterns. Like, how close could we get to a modern library like React using ECMAScript version 1.0 in Internet Explorer 4 or some poo poo? Could we have "invented" something like React much earlier? When did AJAX hit the scene? I know a bit about how HTML and CSS standards have introduced new features over time but not as much about the evolution of JavaScript features. As a veteran of this war, I can tell you what held JavaScript back from my point of view:

- Nobody knew the good parts of JavaScript were this good. I mean, probably the Lisp people knew, but they did not tell anyone, nor did they show us. People would use anonymous functions, but not abuse anonymous functions. The idea of using anonymous functions to reduce the scope of variables was not popular, so everything was global all the time.
- There was really not much to do with JavaScript. Expectations for what a web page must do were low. It was okay to fill out a form, send it, and have the validation server-side.
- JSON was not invented yet. People were sharing data between server and client in retarded ways.
- Everyone was writing "vanilla JavaScript" because good frameworks were not popular.

Technology has never been the problem, beyond speed. People were doing the things you now use Ajax for using stuff like iframes or image.src. Tei fucked around with this message at 13:40 on Feb 6, 2019
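The scope trick in that first point — using an anonymous function to keep variables out of the global scope — is the classic IIFE / module pattern. A sketch in era-appropriate JavaScript (the names are illustrative):

```javascript
// Module pattern circa the era being described: an anonymous function,
// invoked immediately, so `count` never becomes a global variable.
var counter = (function () {
  var count = 0; // private to this closure

  return {
    increment: function () { count += 1; return count; },
    value: function () { return count; }
  };
})();

counter.increment();
counter.increment();
console.log(counter.value());   // 2
console.log(typeof count);      // "undefined" -- count did not leak out
```

Everything this needs already worked in ES3-era browsers; the pattern just wasn't widely known until the late 2000s.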
# ? Feb 6, 2019 12:29 |
|
Was there any well known site that was really ahead of the game when it came to creating a lot of the page with JS? One of the first I remember was Twitter, but that wasn't terribly long ago...
|
# ? Feb 6, 2019 16:21 |
|
Thermopyle posted:Was there any well known site that was really ahead of the game when it came to creating a lot of the page with JS? Wolfram|Alpha is from 2009, so it is not old enough. Gmail is from 2004, so that one is better. Newgrounds.com had many games made in Flash, and from Flash MX on that was ActionScript 2.0 (an ECMAScript dialect). Flash MX was released around 2000. The website itself (Newgrounds) was probably implemented in a traditional way (I have not checked, but it seems that way). Google Docs is probably about the biggest thing you can build in JavaScript... if it is really built in JavaScript; maybe it is written in something else and compiled to JavaScript. Tei fucked around with this message at 16:51 on Feb 6, 2019
# ? Feb 6, 2019 16:49 |
|
Is there a way to mask the contents of a div in a non-rectangular shape in a manner that I can animate easily with either JS or CSS transitions? For example, say I have a div with a search field in it that is hidden by default. When a user clicks a magnifying glass icon, the search field div does an animated "wipe," revealing the search field, but the edge of the wipe isn't a vertical edge, it's diagonal and there's text behind it so I can't use CSS borders to create the shape. Any ideas? Is this even achievable? The only way I can think of is to use an extra background div to create the slanted color block, and then animate the width of the div contents at the same speed as the background. But this sounds like it'd be pretty fragile. I'd like to use clip-path, but it has terrible support in most browsers. kedo fucked around with this message at 17:51 on Feb 6, 2019 |
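For the record, the clip-path approach being ruled out would look something like this — class names are invented, and polygon-to-polygon transitions only interpolate when both shapes have the same number of vertices (support in early 2019 was indeed the sticking point, with -webkit- prefixes needed where it worked at all):

```css
/* Hidden state: a zero-width sliver using the same four vertices
   as the revealed shape, so the transition can interpolate */
.search-panel {
  clip-path: polygon(0 0, 0 0, 0 100%, 0 100%);
  transition: clip-path 0.3s ease-out;
}

/* Revealed state: the third point at 85% gives the slanted right edge */
.search-panel.open {
  clip-path: polygon(0 0, 100% 0, 85% 100%, 0 100%);
}
```

The background-div fallback described above is effectively a manual version of this same wipe, traded for broader browser support.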
# ? Feb 6, 2019 17:39 |
|
I don't know. Random stuff: https://davidlynch.org/projects/maphilight/docs/demo_simple.html https://cactusthemes.com/blog/using-css-shapes-to-create-non-rectangular-layout/ Oh, god, I am not helping. How can I delete this post? Tei fucked around with this message at 18:00 on Feb 6, 2019
# ? Feb 6, 2019 17:42 |
|
I'm trying to find portfolio-ish websites for freelance software developers from which I can... Not web designers, of which bajillions appear in Google search. There's some amount of overlap in target audience, but I'm just wondering if I can find anything specific to this niche. Anyone have anything they've seen which they like? Thermopyle fucked around with this message at 18:47 on Feb 6, 2019
# ? Feb 6, 2019 18:45 |
|
I've always been partial to Nir Sofer's portfolio. edit: More serious answer: most portfolio sites that I've seen for non-web devs that aren't just GitHub profile links tend to be super cringeworthy personal blogs. The Fool fucked around with this message at 19:03 on Feb 6, 2019
# ? Feb 6, 2019 18:58 |
|
Thermopyle posted:Was there any well known site that was really ahead of the game when it came to creating a lot of the page with JS? I think Google Maps was the real game-changer because it catapulted XMLHttpRequest into the spotlight. It had already existed for some time but nobody had used it for something so high-profile and technically impressive. Recall how MapQuest and the like worked at the time. Suddenly everybody realised what dynamic resource loading could really do.
|
# ? Feb 6, 2019 20:35 |
|
Does anyone have any articles - with data, preferably - on why all sites should have a Contact Us page? My sales guys are inexplicably resistant to the idea of a Contact Us page, and I think me pointing out that literally every site on the internet has one won't work.
|
# ? Feb 6, 2019 23:42 |
|
LifeLynx posted:Does anyone have any articles - with data, preferably - on why all sites should have a Contact Us page? My sales guys are inexplicably resistant to the idea of a Contact Us page, and I think me pointing out that literally every site on the internet has one won't work. Not a lot of data out there that I could find, but these guys raise an interesting point: https://wiredimpact.com/blog/why-you-should-have-a-contact-page/ If you have a Contact Us page you can include it in your user flow and engagement metrics in analytics. How many sessions visit it? Do they do so before finalizing an order (ecommerce only, obviously)? The other advantages are pretty obvious, like topic-based issue routing (e.g. a "why are you contacting us?" dropdown) that ensures sales inquiries go to sales, support to support, etc. There's some real meat in those; I think close to 1/3 of the sales inquiries where I work now come in through either the contact page or the chat. I also view it as a signal of trust if a company publishes their actual physical address on the page instead of just nothing, like most ecommerce/consultancy players do to either look bigger than they are or avoid revealing they're overseas. Both of the above ideas data-wise kinda require you to have one in place though, so you can measure the effectiveness either through analytics or inbound case analysis. Without one, you won't be able to see whether it works for your case or not.
|
# ? Feb 7, 2019 00:02 |
|
Are we talking about the olden days of web development? Let me tell you all about how everyone thought CSS was dumb and wouldn't go anywhere back in 1998...
|
# ? Feb 7, 2019 00:44 |
|
Scaramouche posted:Not a lot of data out there that I could find, but these guys raise an interesting point: Thanks! I hope this is enough. kedo posted:Are we talking about the olden days of web development? Let me tell you all about how everyone thought CSS was dumb and wouldn't go anywhere back in 1998... I've been tempted to download Netscape Navigator 3.whatever and try to make something modern-looking using obsolete code in Notepad just for the nostalgia trip, but I think it'd be more frustrating than it sounds.
|
# ? Feb 7, 2019 00:59 |
|
Doom Mathematic posted:I think Google Maps was the real game-changer because it catapulted XMLHttpRequest into the spotlight. It had already existed for some time but nobody had used it for something so high-profile and technically impressive. Recall how MapQuest and the like worked at the time. Suddenly everybody realised what dynamic resource loading could really do. I feel like Gmail was the more canonical example that led web developers down the path of the SPA and using XHR to load everything. Google Maps was technically impressive, but I feel like it didn't become super influential until Google decided that GPS manufacturers shouldn't exist and released the Android app for it.
|
# ? Feb 7, 2019 01:20 |
|
LifeLynx posted:Does anyone have any articles - with data, preferably - on why all sites should have a Contact Us page? My sales guys are inexplicably resistant to the idea of a Contact Us page, and I think me pointing out that literally every site on the internet has one won't work. Well, at a bare minimum, GDPR requires that sites that provides service to literally anyone in Europe at any time ever have a process to handle requests to delete personal data, under threat of fines that scale up to approximately "all of the money you will ever have in your entire life". Which is lower cost: building out a fully automated system to handle this, or just having a contact email somewhere and a boilerplate blurb about sending GDPR requests there?
|
# ? Feb 7, 2019 09:19 |
|
Microsoft security chief: IE is not a browser, so stop using it as your default IE is not a browser? Cool I can stop supporting it then.
|
# ? Feb 7, 2019 23:54 |
|
Oh boy, IE. Most of the HTML/JavaScript stuff I do these days is used to provide user interaction within workflow processes in a document management system (C#/WPF). It uses the IE 9 rendering engine. The vendor had to give out a warning a couple of years ago when they switched (they were using IE 7, I think, until 2016). It was fun having to help our help desk upgrade a bunch of customers when the JavaScript calendar controls we were using didn't work anymore. I'd give anything to be able to just enter <input type="date"/> and be done with it.
|
# ? Feb 8, 2019 02:07 |
|
The Merkinman posted:Microsoft security chief: IE is not a browser, so stop using it as your default It's a fun headline, but the bulk of what he's saying is pretty on the money. Well, I mean, it boils down to "IE is bad, don't use it", which is a reasonable statement.
|
# ? Feb 8, 2019 10:07 |
|
I've found Edge in a lot of cases works better than Chrome or other competitors; it's got features and speed. I'm sure there are reasons for switching to webpack, but it sure seems like it's going to delay their minimal 64-bit-only Windows Lite project, as webpack is still 32-bit.
|
# ? Feb 8, 2019 10:27 |
|
Nolgthorn posted:I've found Edge in a lot of cases works better than Chrome or other competitors; it's got features and speed. I'm sure there are reasons for switching to webpack, but it sure seems like it's going to delay their minimal 64-bit-only Windows Lite project, as webpack is still 32-bit. Internet Explorer was highly popular with blind people because some screen readers were implemented using low-level functions adapted to how IE internals work, so obviously those readers would not work on other browsers. To the dismay of everyone trying to push accessibility and standards to make the web usable for everyone. Some things can be implemented in two different ways: a proprietary extension or binary blob, or a well-documented standard. IE has been on the wrong side of that fence many times.
|
# ? Feb 8, 2019 11:17 |
|
|
My understanding was that Edge was a rewrite from scratch; they ditched all backwards compatibility for IE stuff.
|
# ? Feb 8, 2019 12:14 |