<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[Stories by Andrea Giammarchi on Medium]]></title>
        <description><![CDATA[Stories by Andrea Giammarchi on Medium]]></description>
        <link>https://medium.com/@webreflection?source=rss-cc83da4b8256------2</link>
        <image>
            <url>https://cdn-images-1.medium.com/fit/c/150/150/1*54OyE6rg-MdELwb9mW56Xw.png</url>
            <title>Stories by Andrea Giammarchi on Medium</title>
            <link>https://medium.com/@webreflection?source=rss-cc83da4b8256------2</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Wed, 22 Apr 2026 10:25:16 GMT</lastBuildDate>
        <atom:link href="https://medium.com/@webreflection/feed" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[About modern AI and DX]]></title>
            <link>https://webreflection.medium.com/about-modern-ai-and-dx-a7e3da5d7464?source=rss-cc83da4b8256------2</link>
            <guid isPermaLink="false">https://medium.com/p/a7e3da5d7464</guid>
            <dc:creator><![CDATA[Andrea Giammarchi]]></dc:creator>
            <pubDate>Mon, 23 Mar 2026 16:53:41 GMT</pubDate>
            <atom:updated>2026-03-23T17:08:36.191Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/640/1*3YwZuvi7UcBFyuZZx5uirw.jpeg" /><figcaption>Picture from <a href="https://wagwalking.com/symptom/why-is-my-dog-biting-his-tail">https://wagwalking.com/symptom/why-is-my-dog-biting-his-tail</a></figcaption></figure><p>It’s that time of the quarter when I need to quickly (yet consciously) ship results, and gosh if my landscape of projects and libraries is fragmented these days … this story is about me unleashing the power of AI while wondering what it is that we are doing wrong, if anything!</p><h4>The Pre-AI Era</h4><p>We’ve all been there, and too many of us (developers) still are: the time when search engines’ answers mattered the most; the time when seniors would dig into every single detail they could of any project whose documentation or API pages were easy to reason about, surf, learn from, or use; a time when “<em>learning-before-using</em>” was the norm, not the afterthought.</p><p>During this time, choosing the best projects with the most satisfying websites and the best documentation out there proved to be a successful strategy for <strong>both</strong> developers and AI training: caveats, edge cases, examples … the more the better, to indirectly shape what came next!</p><p>During those days though, side projects, or projects without enough developers behind such a documentation effort, became, out of the blue, irrelevant … not because their ideas or solutions weren’t among the best out there for that specific problem, but simply because <em>SEO</em> and/or training data didn’t include those projects/solutions.</p><h3>The AI Era</h3><p>Please keep “<em>this is my experience so far</em>” as a constraint around anything I am going to tell you, but today I had a choice around a topic:</p><ul><li>learn how to do it the best way, ask for reviews, improve my knowledge around that topic and everything else … or …</li><li>ask AI to port one 
PL related thing to another PL related thing …</li></ul><p>Due to time constraints, I went for the second approach: I asked AI to help me bring something strictly JS and NodeJS related to something strictly Python, and Python only, related, and it was “<em>a few minutes in the making</em>”: success 🥳</p><h4>The Questionable</h4><ul><li>have I learned anything new in that process? <strong>NO</strong></li><li>have I reviewed and tested that everything produced worked at least as seamlessly as the NodeJS counterpart I asked it to migrate? <strong>YES</strong></li><li>am I satisfied with the outcome? Also <strong>YES</strong></li></ul><p>After all, without learning much around a topic I actually know a lot about in one PL I’m familiar with, I’ve managed to publish something for another environment and programming language, one I can surely read and write, but one it would have taken me much longer to migrate to as a whole.</p><h3>The Elephant in the Room</h3><p>In my quick, easy, tested, and working migration I never needed to question any of this:</p><ul><li>what do those dependencies do?</li><li>do I need the entirety of those dependencies for this purpose?</li><li>is that the best way I can use such a module’s API?</li><li>how many things does that API offer anyway?</li><li>is it that easy to publish this module on another ecosystem?</li></ul><p>Now, please bear with me, I am not so naive as to publish something that couldn’t make sense out there, no matter the target platform or programming language, but in this whole process I never needed to look at a single “<em>beautifully documented API page</em>” because AI had it all covered … and I started wondering …</p><ul><li>are beautiful, styled, modern, fast, and efficient documentation sites around any technology still relevant these days? 
AI doesn’t care about the latest CSS hack to make the Web beautiful; all it cares about is API and documentation content: that’s literally it!</li><li>as modern developers use more and more AI, does anything more complicated than a Markdown page make sense, either to eventually show them some extra content or to tell AI how to fix/solve/work around a specific issue?</li><li>is the Web, besides being the best target platform for everything out there, as it’s almost always surely available as a view for consumers, still relevant for developers, when all they need is more like an <a href="https://agents.md/">AGENT.md</a> file, as opposed to something beautiful to read and learn from?</li></ul><h3>My Current Thoughts</h3><p>The learning curve is moving from “<em>which library would you use to solve this problem</em>” to “<em>which AI is best at helping you solve this problem</em>” these days, and while the answer is not trivial, the pattern is: nobody cares about beautiful documentation out there, everyone and everything cares about the availability of such documentation, at least in its most valuable form, which is <strong>Markdown</strong> format, <strong>not HTML</strong>, and I wonder if we’re shifting from an “<em>impress developers with such docs</em>” era to an “<em>impress AI with easy to grasp knowledge and solutions</em>” one, where content becomes again “<em>the King</em>” over aesthetics or “<em>playfulness</em>” of solutions: AI playing with that has no value, rather a waste of tokens, and developers using AI don’t read, surf, or learn anything anymore, at least nothing they wouldn’t care about when the topic is outside their scope and already solved elsewhere with ease thanks to better AI related documentation.</p><h4>Is this inevitable?</h4><p>Yes, I think we’re facing times where nobody will learn anything extra or unnecessary, and the details, caveats, and edge cases will be covered only by the most updated AI out there, yet that knowledge and experience is (not so) slowly 
getting lost or becoming irrelevant.</p><p>Nowadays it feels like a good-looking documentation site is an exercise that doesn’t bring value anymore, if not for developers coming from the pre-AI era, ’cause anyone else won’t ever even look at that website.</p><p>To me DX means helping developers, and if developers won’t look at anything online anymore, we gotta change the way we ship software: AI first, or it never happened in the modern ecosystem 👋</p><h3>P.S.</h3><p>I am not trying to tell anyone documentation sites don’t matter, quite the opposite … I am trying to say that the more playful a documentation site is for humans these days, the less it matters to the AI behind every human writing code these days.</p><p>We need to understand the old days are gone for good, and we should make some effort to make modern days more appealing to the new eyes of these times: AI eyes, not human ones anymore, I am afraid.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=a7e3da5d7464" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[JavaScript “protected” properties]]></title>
            <link>https://webreflection.medium.com/javascript-protected-properties-9f497c35d801?source=rss-cc83da4b8256------2</link>
            <guid isPermaLink="false">https://medium.com/p/9f497c35d801</guid>
            <category><![CDATA[javascript-development]]></category>
            <category><![CDATA[javascript]]></category>
            <category><![CDATA[javascript-tips]]></category>
            <dc:creator><![CDATA[Andrea Giammarchi]]></dc:creator>
            <pubDate>Tue, 14 Oct 2025 08:48:22 GMT</pubDate>
            <atom:updated>2025-11-20T09:26:45.084Z</atom:updated>
<content:encoded><![CDATA[<p>Private JS fields are both cool and “<em>wasteful</em>”; let me expand on that.</p><pre>class A {<br>  #value;<br>  constructor(value) {<br>    this.#value = value;<br>  }<br>}<br><br>class B extends A {<br>  #value;<br>  constructor(value) {<br>    super();<br>    this.#value = value;<br>  }<br>}</pre><p>If we create new B(&#39;secret&#39;) the instance will have 2 private fields:</p><ul><li>the #value that is defined through the A class, a reference that only the A class definition can reach/change/read/use and that will throw if anything in B tries to reach it, as in super.#value — nope nopity nope, that will be a <em>SyntaxError: Unexpected private field</em></li><li>the #value that is defined through the B class, a reference that only the B class definition can reach/change/read/use</li></ul><p>The “<em>wasteful</em>” part of this specification can be described as such:</p><ul><li>we need accessors to eventually be able to at least read that super value, but accessors fully invalidate the secret/private nature of that field if the reason to expose them is to give the inheriting class a way to deal with its inherited internals</li><li>the field needs to use “<em>two private slots</em>” instead of one, something that the protected keyword would’ve solved, something that does not exist (yet?) 
in the JavaScript Programming Language</li></ul><pre>// ⚠️ this does not exist/work<br><br>class A {<br>  // @protected meta example<br>  &amp;value;<br>  constructor(value) {<br>    this.&amp;value = value;<br>  }<br>}<br><br>class B extends A {<br>  log() {<br>    console.log(this.&amp;value);<br>  }<br>}<br><br>new B(&#39;protected&#39;).log();<br>// &#39;protected&#39;</pre><h3>A workaround for protected fields</h3><p>Because private fields can be accessed only at class definition time, what if we wrap such access in a way that is still “<em>private</em>”, at least per module scope, and grants either read or write access?</p><pre>// 🥳 this works wonderfully<br>class A {<br>  // #value read/write<br>  static value(self, ..._) {<br>    if (_.length) [self.#value] = _;<br>    return self.#value;<br>  }<br><br>  #value;<br><br>  constructor(value) {<br>    this.#value = value;<br>  }<br>}<br><br>// extract the static method and ...<br>const { value } = A;<br>// ... erase the static method !!!<br>delete A.value;<br><br><br>// ... 
there we go ...<br>class B extends A {<br>  log() {<br>    console.log(value(this));<br>    // change value via<br>    // value(this, &#39;change&#39;)<br>  }<br>}<br><br>new B(&#39;secret&#39;).log();<br>// &#39;secret&#39;</pre><h4>… plus a quick helper …</h4><p>Because forgetting to remove the static field might happen, and because this pattern also works for private methods, wouldn’t it be cool to use a tiny helper that helps us extract such properties/fields?</p><pre>const _protected = Class =&gt; new Proxy(Class, {<br>  get(Class, staticField) {<br>    const value = Class[staticField];<br>    delete Class[staticField];<br>    return value;<br>  }<br>});<br><br>const { value, method } = _protected(A);</pre><h3>Update: an even simpler approach 🥳</h3><p>Huge thanks to <a href="https://x.com/tombl_/status/1978032755947577777">@tombl_</a> for pointing out there’s not even a need to delete the static field, because a static block would work the same:</p><pre>let value;<br>class A {<br>  static {<br>    // #value read/write<br>    value = (self, ..._) =&gt; {<br>      if (_.length) [self.#value] = _;<br>      return self.#value;<br>    };<br>  }<br><br>  #value;<br><br>  constructor(value) {<br>    this.#value = value;<br>  }<br>}<br><br>class B extends A {<br>  log() {<br>    console.log(value(this));<br>  }<br>}<br><br>new B(&#39;secret&#39;).log();<br>// &#39;secret&#39;</pre><p>Gotta admit I often forget about the <a href="https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Classes/Static_initialization_blocks">static block in classes</a>; this makes the helper obsolete already, which is great!</p><p>And “<em>that’s all folks</em>”: if you ever need to expose private fields internally or within a private scope, now you know a little trick to do so!</p><p>Enjoy JS 👋</p><h4>A TS friendly alternative that is also as fast as it can get? 
Sure!</h4><pre>/**<br> * @template T,V<br> * @typedef {(self: T) =&gt; V} AccessorGet<br> */<br><br>/**<br> * @template T,V<br> * @typedef {(self: T, value: V) =&gt; V} AccessorSet<br> */<br><br>/**<br> * @template T,V<br> * @typedef {Object} Accessor<br> * @property {AccessorGet&lt;T,V&gt;} get<br> * @property {AccessorSet&lt;T,V&gt;} set<br> */<br><br>/**<br> * @template T,V<br> * @param {AccessorGet&lt;T,V&gt;} get<br> * @param {AccessorSet&lt;T,V&gt;} set<br> * @returns {Accessor&lt;T,V&gt;}<br> */<br>const accessor = (get, set) =&gt; ({ get, set });<br><br>/** @type {Accessor&lt;A, string&gt;} */<br>let _value;<br><br>class A {<br>  static {<br>    _value = accessor(<br>      self =&gt; self.#value,<br>      (self, value) =&gt; (self.#value = value),<br>    );<br>  }<br><br>  /** @type {string} */<br>  #value;<br><br>  /** @param {string} value */<br>  constructor(value) {<br>    this.#value = value;<br>  }<br>}<br><br>class B extends A {<br>  log() {<br>    console.log(_value.get(this));<br>  }<br>}<br><br>new B(&#39;secret&#39;).log();<br>// &#39;secret&#39;</pre><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=9f497c35d801" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Taming JS Proxy API]]></title>
            <link>https://webreflection.medium.com/taming-js-proxy-api-d38e1f425f51?source=rss-cc83da4b8256------2</link>
            <guid isPermaLink="false">https://medium.com/p/d38e1f425f51</guid>
            <category><![CDATA[proxy]]></category>
            <category><![CDATA[web-development]]></category>
            <category><![CDATA[wasm]]></category>
            <dc:creator><![CDATA[Andrea Giammarchi]]></dc:creator>
            <pubDate>Tue, 15 Jul 2025 10:10:57 GMT</pubDate>
            <atom:updated>2025-07-15T10:19:46.268Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*ZYteXiTKx6CM7sWsdvOKhQ.jpeg" /><figcaption>Photo by <a href="https://unsplash.com/@juanster?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Juan Davila</a> on <a href="https://unsplash.com/photos/lake-under-blue-sky-during-daytime-P8PlK2nGwqA?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Unsplash</a></figcaption></figure><p>With the plethora of libraries dealing with remote references and the amount of WASM targeting runtimes, I think it’s time to explain how proxies should be used to mimic, in the best possible way, type discrepancies across programming languages and whatnot.</p><h3>Rule #1 — the 3 kinds of Proxy</h3><p>It doesn’t matter what you hold as a proxy reference, but it does matter <strong>how</strong> you hold it. Native JS APIs are able to “<em>drill</em>” into proxied references so that:</p><ul><li>typeof proxy should return either object or function</li><li>Array.isArray(proxy) should return true if the reference is meant to be used, or behave, like a JS Array (tuples, lists, collections … in JS these are likely handled as arrays)</li><li>the get, set, and has traps should all take symbol keys into account, instead of failing when symbols are checked or accessed. 
Please note that Object.prototype.toString.call(proxy) will implicitly access Symbol.toStringTag, as an example … be sure your proxies can handle these scenarios, or simply return nothing when typeof key is not string</li></ul><h4>Proxy Object</h4><pre>const proxy = new Proxy({ ref }, {<br>  get({ ref }, key, receiver) {},<br>  set({ ref }, key, value, receiver) {},<br>  has({ ref }, key) {},<br>});</pre><p>Assuming ref is a pointer to your real value, if such a pointer’s goal is to mimic object literals, records, or dictionary-like references, proxying { ref: value } guarantees the surrounding code will handle that reference as a plain object.</p><h4>Proxy Array</h4><pre>const proxy = new Proxy([ ref ], {<br>  get([ ref ], key, receiver) {},<br>  set([ ref ], key, value, receiver) {},<br>  has([ ref ], key) {},<br>});</pre><p>Any iterable should be referenced as such, to be sure that checks such as Array.isArray(proxy) return true instead of false, playing very well along with all regular JS libraries and expectations when lists, collections, tuples, call them as you like, are meant to be handled like arrays.</p><h4>Proxy Function &amp; Class</h4><p>There are various ways to reference something that is meant to be invoked, but here’s the tricky part: other programming languages might not have all the function variants present in JS (arrows, short-hand methods, legacy functions, modern classes), so that the “<em>one solution to rule them all</em>” is:</p><pre>// reusable for all function cases<br>function proxied() {<br>  &#39;use strict&#39;;<br>  return this;<br>}<br><br>const proxy = new Proxy(proxied.bind(ref), {<br>  construct(target, args, newTarget) {<br>    const ref = target();<br>    // ... return new instance of that ref,<br>    // or throw if that ref is not a class ...<br>  },<br>  apply(target, context, args) {<br>    const ref = target();<br>    // ... return the invoke of that ref,<br>    // or throw (maybe?) 
if that ref requires `new` ...<br>  },<br>});</pre><p>With the above traps, typeof proxy will return function as expected, and invokes both with and without new will be possible. Of course it’s also possible to fine-tune and branch out specific cases, one where new is never desired or one where new is always desired:</p><pre>// it fails with new<br>const arrow = new Proxy(() =&gt; ref, {<br>  apply(target, _, args) {<br>    const ref = target();<br>    // ... invoke the ref with args and no context<br>  },<br>  // construct trap won&#39;t ever be invoked even if defined<br>});<br><br>class Proxied {}<br>class ProxiedHandler {<br>  // the proxy handler constructor<br>  constructor(ref) {<br>    this.ref = ref;<br>  }<br><br>  // the actual Proxy trap for `new`<br>  construct(_, args, newTarget) {<br>    // use the handler ref property<br>    const { ref } = this;<br>    // return a new instance for that ref<br>  }<br><br>  // apply trap won&#39;t ever be invoked even if defined<br>}<br><br>// it will fail without new<br>const Class = new Proxy(Proxied, new ProxiedHandler(ref));</pre><p>I am letting you decide which approach is better or easier to reason about, but I usually handle things via proxied.bind(ref) because it always works, and if it needs to fail when new is used or not: <em>let it fail</em>!</p><h3>Rule #2 — don’t repeat proxy handlers</h3><p>When proxies are used for <em>WASM</em> interoperability reasons or for reflecting API purposes, or simply any other <a href="https://en.wikipedia.org/wiki/Foreign_function_interface">FFI</a> use case, it’s very likely that the runtime will handle hundreds, if not thousands, of proxied references during the lifecycle of the program.</p><p>On top of that, things might easily become slower than they need to be, so keeping in mind that proxy handlers can be shared, or can actually share their traps if these are instances and not just literals, will save a lot of Garbage Collection work + it will be definitely 
faster and lighter than it is now.</p><p>I am guilty as charged in this post because, for brevity’s, context’s, and simplicity’s sake, I have used object literals as handlers, but the truth is that none of my heavily Proxy based libraries uses runtime object literals for handlers, as that is 99.9% of the time a sloppy slippery slope for performance, RAM, and GC pauses.</p><p>Accordingly, every time you write something like this that could occur more than once:</p><pre>const proxied = ref =&gt; new Proxy({ ref }, {<br>  get(target, key, receiver) {},<br>  // ...<br>});</pre><p>You should rather refactor that as such:</p><pre>const objectHandler = {<br>  get(target, key, receiver) {},<br>  // ...<br>};<br><br>const proxied = ref =&gt; new Proxy({ ref }, objectHandler);<br><br>// ... or even ...<br><br>class ObjectHandler {<br>  constructor(ref) {<br>    this.ref = ref;<br>  }<br><br>  // traps<br>  get(_, key, receiver) {<br>    const { ref } = this;<br>    // ...<br>  }<br><br>  // ...<br>}<br><br>const proxied = ref =&gt; new Proxy(<br>  // good enough to mimic literals<br>  Object.prototype,<br>  // ref is handled directly<br>  new ObjectHandler(ref),<br>);</pre><p>As a result, your code will be cleaner, faster, and less greedy on the RAM; plus, each handler could itself carry more interesting data that its prototype can reuse across handlers, change state, track things, use the this context when needed to point at such handler, and so on.</p><h3>Rule #3 — trap only what needs to be trapped</h3><p>If you have a handler whose goal is to define an object or an array and you have apply and construct traps in there, that’s both confusing and wrong.</p><p>Proxies are special “<em>beasts</em>” in JS and, despite their names, not all traps will be invoked, so that function related traps, as an example, won’t happen with object literals … we need to be smarter there, for instance using the proxied.bind(ref) approach and then adding all sorts of traps to make that behave both as a literal and a 
function, but then again, that will make such a proxy an “<em>alien looking</em>” kind of object, with typeof <em>function</em> yet both invokable and usable as an object literal … well, you do you, but only if you really know what you are doing: it’s important to learn the potential of all traps, when these are needed and when they are not, and leave everything else to the proxied target, which will already answer properly with its expected behaviors.</p><h4>… and be careful with Arrays</h4><p>Proxied arrays are expected to always return at least length among their ownKeys; that’s granted if you proxy [ ref ], but when such length is retrieved it cannot be just 1, it has to reflect the real length or size of whatever it’s pointing at.</p><p>The same goes for Symbol.iterator, which must be the one that iterates over the real ref, not the proxied reference, and so on … so add tests to your proxies and be sure all edge cases, or at least most edge cases, are covered.</p><p>And that’s a wrap; I hope you’ve found something useful or new in this post and that you’ll manage to improve the proxies within your project sooner rather than later, so that interoperability and performance will improve, RAM usage will be reduced, and so will “<em>surprises</em>” when dealing with proxied variables.</p><p>If there’s one extra point to make in this post, it’s that <strong>proxies break the structured clone algorithm</strong> and cannot travel across workers, so having a way to recognize proxies and prevent them from being sent elsewhere is always a good idea, along with a way to serialize them so that they can be reflected elsewhere … and if you don’t know how or where to start, remember that both <a href="https://github.com/WebReflection/coincident#readme">coincident</a> and its <a href="https://github.com/WebReflection/reflected-ffi#readme">reflected-ffi</a> use all best practices to work seamlessly across tabs, workers, or even the server side of affairs 👋</p><img 
src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=d38e1f425f51" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Some Service Worker Absurdity]]></title>
            <link>https://webreflection.medium.com/some-service-worker-absurdity-344fca5ecdea?source=rss-cc83da4b8256------2</link>
            <guid isPermaLink="false">https://medium.com/p/344fca5ecdea</guid>
            <category><![CDATA[web-development]]></category>
            <category><![CDATA[pwa]]></category>
            <category><![CDATA[service-worker]]></category>
            <dc:creator><![CDATA[Andrea Giammarchi]]></dc:creator>
            <pubDate>Thu, 19 Jun 2025 13:50:50 GMT</pubDate>
            <atom:updated>2025-07-29T06:28:59.806Z</atom:updated>
<content:encoded><![CDATA[<p>It’s been years since this lovely primitive landed on all relevant browsers, but the current state feels stuck in the ’90s and is hostile for no reason:</p><ul><li>if the <em>ServiceWorker</em> is registered as a module, <em>Firefox</em> might throw out of the box when statically importing more complex files</li><li>in <em>Chrome/ium + Edge</em> browsers you cannot use lovely shortcuts such as https://esm.run/@project/module because it does a redirect and, apparently, that’s not allowed</li><li>if you use the right <em>URLs</em> with no redirects, Firefox breaks, and those imports also cannot be asynchronous, module or not … the dynamic import(...) is not allowed for “<em>reasons</em>” nobody can actually reason about (keep reading)</li><li>if the ServiceWorker is not registered as a module, we’re off with the blocking/synchronous <a href="https://developer.mozilla.org/en-US/docs/Web/API/WorkerGlobalScope/importScripts">importScripts</a>, a Jurassic, awkward, and synchronous global polluter for an API space where nothing can block or be synchronous by design (see event.respondWith(value:Promise) and the caching based APIs, all async)</li></ul><p>Now … “<em>what if I told you</em>” that none of these imposed limitations or forever unresolved bugs makes any sense, when one could write <a href="https://github.com/WebReflection/examples/blob/main/ts/tsw.js">a ServiceWorker like this one</a> and no browser would complain about it?</p><pre>let defaultValue = null;<br><br>const dflt = (_, value) =&gt; {<br>  defaultValue = value.trim();<br>};<br><br>const named = (_, values) =&gt; {<br>  const literal = [`default:${defaultValue ?? 
&#39;null&#39;}`];<br>  for (const exports of values.split(&#39;,&#39;)) {<br>    const [ref, name] = exports.split(&#39;as&#39;);<br>    literal.push(`${name.trim()}:${ref.trim()}`);<br>  }<br>  return `\nreturn {${literal.join(&#39;,&#39;)}};`;<br>};<br><br>const amaro = fetch(&#39;https://esm.run/@webreflection/amaro&#39;).then(r =&gt; r.text()).then(<br>  code =&gt; Function(<br>    code<br>      .replace(/\/\/# sourceMappingURL=.*$/, &#39;&#39;)<br>      .replace(/export\s+default([^;]+?);/, dflt)<br>      .replace(/export\s*\{(.+?)\}/, named)<br>  )()<br>);<br><br>// ... the rest of the logic</pre><h4>All limitations just vanished 🤦</h4><ul><li>I am effectively using <strong>a CDN with a redirect</strong> to …</li><li><strong>dynamically import</strong> anything I want and …</li><li><strong>evaluating runtime</strong> transformed <strong>code </strong>…</li><li><strong>without blocking</strong> anything at all!</li></ul><p>Now bear with me: except for the last point of this list, it feels like a circus of bad practices all compressed in a few lines of code … fair … but can we agree that there is nothing stopping me from bypassing those absurd limitations, limitations or inconsistencies (Firefox breaking on static imports) that brought me to write that “<em>horror show</em>” of working code?</p><p><em>What’s that code for?</em> Well, <a href="https://github.com/WebReflection/amaro#readme">I’ve ported NodeJS amaro to the Web</a> and a user asked me if it could transpile .ts files on the fly via regular imports intercepted by a Service Worker, so <a href="https://webreflection.github.io/examples/ts/">I’ve created a Proof of Concept</a> that showcases just that and … it works 🥳</p><h4>No counter-arguments</h4><p>One could argue that dynamic imports are bad because (really … why is that?) 
but if I need to evaluate code in a SW instead, the situation just got worse.</p><p>The follow up comment would be “<em>yeah but … you wrote that Service Worker code and you cannot use a Service Worker from a CDN, it’s your own code in there</em>”, and so was my own intent to use dynamic imports when/if needed (think about polyfills too, for primitives and APIs not there yet).</p><p>It’s not that me evaluating code in my file is better than me explicitly importing a well known and maintained module from a well trusted CDN, you know? So why can’t we have dynamic imports, when actually we can, if we stretch our nonsense a bit beyond the “<em>you must not use eval</em>”?</p><p>I don’t want to write that code, but if it’s the only way to make my site/service better, why should I listen to standards that have no concrete reasons to impose those limitations, when the alternative is to use blocking old primitives that could also be generated at runtime by a server after <em>UserAgent</em> sniffing, piling up the list of bad practices as a result?</p><h4>Extremely hard to debug</h4><p>For all the Web standards promoters out there, “use PWAs” and yada-yada: debugging a Service Worker has been, and still is, one of the most complicated and convoluted experiences the Web development flow has to offer … how can we promote PWAs if nobody can easily understand what’s going on in there? 
Please, let’s focus a bit on these “modern” primitives, left as a developer exercise to master, circumvent, and bypass in their limitations, so that we can really tell the story about how easy it is to develop for the Web.</p><h3>&lt;/rant&gt;</h3><p>I am now facing an issue that appears only on iPad (not on iPhone) and a slightly different one that appears only on Firefox, and you cannot imagine how frustrating it is, among a thousand other things I would like to focus on, to solve these things that work seamlessly on Chrome/ium + Edge, so thank you for your patience in reading this little rant, but please help me reach out to people behind browsers that could empathize with my sentiment and help moving forward around this topic: very much appreciated 🙏</p><p>P.S. yes, I’ve filed tons of bugs in Firefox / Mozilla … most of those are stuck forever with no follow-up, updates, fixes, nothing … dare I say Firefox is already dead but they don’t really want to announce it … the amount of APIs that break and remain broken for years starts becoming unbearable, but breaking on static imports out of a module? Hell no, that’s unacceptable folks, I am both afraid and sorry for this browser 😢</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=344fca5ecdea" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Surviving the Structured Clone algorithm]]></title>
            <link>https://webreflection.medium.com/surviving-the-structured-clone-algorithm-130608b69f47?source=rss-cc83da4b8256------2</link>
            <guid isPermaLink="false">https://medium.com/p/130608b69f47</guid>
            <category><![CDATA[web-development]]></category>
            <category><![CDATA[workers]]></category>
            <category><![CDATA[serialization]]></category>
            <dc:creator><![CDATA[Andrea Giammarchi]]></dc:creator>
            <pubDate>Mon, 19 May 2025 14:40:02 GMT</pubDate>
            <atom:updated>2025-05-19T14:53:55.196Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*J6JiRE5CPxkC6pWxEBPgLw.jpeg" /><figcaption>Photo by <a href="https://unsplash.com/@alain_pham?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Alain Pham</a> on <a href="https://unsplash.com/photos/construction-frame-P_qvsF7Yodw?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Unsplash</a></figcaption></figure><p>Used internally by <em>IndexedDB</em>, <em>Workers</em> and other communication <em>Channels</em>, or used directly via the structuredClone global utility, the <a href="https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Structured_clone_algorithm">structured clone algorithm</a> is both “<em>a wonder</em>” for <em>JS</em> primitives and “<em>a curse</em>” for developers.</p><h3>The Wonder</h3><p>One could transfer, store or clone almost everything that the platform provides, most notably very complex primitives such as entire files, personal storage access credentials, buffers or views of all kinds and whatnot.</p><p>There are just a few limitations that, differently from <em>JSON</em>, which silently ignores or loses data while stringifying, would throw hard at runtime if found within the structured data:</p><ul><li>no functions allowed (reasonable: functions have scope and context access that cannot be brought elsewhere)</li><li>no DOM nodes (also reasonable: workers, for example, have no DOM and surely not the same one live on the main thread)</li><li>accessors and/or private fields + some special properties or the prototype chain are fully ignored</li><li>symbols will throw … but so will <strong>Proxies</strong> !!!</li></ul><h3>The Curse</h3><ul><li><strong>No Proxies</strong> means that most complex projects need to implement their own logic to avoid passing special references around, including foreign interface based wrappers that are the only possible kind of identity that any <em>WASM</em>
targeting programming language can offer to interop with <em>JS</em> … that means: you have a <em>Python</em> <em>Proxy</em> around that contains just some data and you want to postMessage({ some: py_proxy_map }) somewhere else? Nope ☠️</li><li><strong>No Classes</strong> (part 1) means that even if you extend native classes with extra sugar on top, nobody can distinguish that special taste elsewhere, even if the very same library and classes used at the origin of the message are available and identical in the message target world 🤦</li><li><strong>No Classes</strong> (part 2) also means that circumventing the fact no functions can travel is not possible, because only instances of derived native classes can travel: forget about your logic being able to be transferred anywhere else 🥲</li><li><strong>No Classes</strong> (part 3) also means that accessors cannot sit where they belong, which is the prototype, because if they do that data won’t travel, but if they don’t the accessor logic and/or reactivity will be lost 😱</li></ul><h4>The Common (non) Solution …</h4><p>There are literally dozens of attempts to solve this very same issue at the serialization level, providing libraries that accept extensions where, once a gazillion instanceof operations are performed, if one of the registered extensions returns something, then such “<em>something</em>” will be serialized among the rest of the data; but in all my benchmarks that dance is a slow and bloated overhead I really don’t want to deal with anymore:</p><ul><li>it requires both serialization and deserialization</li><li>it requires a lot of callback invocations for extensions of all kinds</li><li>a way to retrieve instances back on the other side is not always provided</li><li>it’s most of the time fully focused on cross Programming Language portability, while I want to solve my <em>JS</em> issues in <em>JS</em> and get rid of any unnecessary abstraction that makes the critical path way slower than it could be</li><li>it produces anyway
something that needs to travel via postMessage and be received via message handlers on the other side, when the project is <em>JS</em></li></ul><p>Don’t get me wrong, projects such as <a href="https://github.com/msgpack/msgpack-javascript">MessagePack</a> or <a href="https://github.com/kriszyp/cbor-x">cbor-x</a> are fantastic, but these don’t solve my specific issue: I want to send <em>JS</em> references to a <em>JS</em> target and I don’t always need a binary overhead, because binary serialization makes sense only with synchronous Atomics.wait, while every other case is better off with just async dances that won’t need 3rd party libraries or binary serialization to work!</p><h4>The JSON limit …</h4><p>I can already hear people thinking “<em>mate, just use JSON then</em>” but that misses the point on so many levels:</p><ul><li>the toJSON() escape hatch won’t get triggered by structured clone + it does not itself provide a way to revive whatever was returned after</li><li><em>JSON</em> cannot represent recursive data and any recursion-capable alternative is not nearly as fast as JSON is, surely not faster than structured clone</li><li>all complex primitives need to be serialized in a JSON-friendly format, where if you pass a buffer as Uint8Array, all its keys from 0 to its length will also pass through the optional callback you passed thinking you were smart in there … it’s instead a slippery slope to slowness that unfortunately not many out there realize: as soon as any callback to serialize or parse back is provided, goodbye performance!</li></ul><p>In short, <em>JSON</em> remains the fastest and preferred way to deal with simple data + it never throws except when unexpected recursion is passed along, but it’s definitely not a solution, although many used it to solve the Proxy issue by providing normal data via toJSON, when accessed; yet we have no way to have the cake (structured clone) and eat it too (a better mechanism than toJSON to both serialize and
deserialize).</p><h3>My Proposal @ WHATWG</h3><p>I wasn’t joking when <a href="https://github.com/whatwg/html/issues/7428#issuecomment-2888486503">I said it was my birthday wish</a>, ’cause if there’s something that drives me crazy, and is extremely infuriating with the kind of projects I’m dealing with daily (WASM driven PLs hooked into JS via main or worker threads), it is the inability to intercept the structured clone internal (recursion-capable) data crawling to provide hooks that would let me decide how any <em>Proxy</em> around users’ code could be transformed or reflected elsewhere, through a way that can be restored on the other side of the affairs.</p><p>In my case, what travels can be anything that the <a href="https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Reflect">Reflect namespace</a> can handle, via postMessage orchestration and SharedArrayBuffer, which handle, via synchronous cross-realm communication, even DOM nodes from the main thread, nodes that can be passed around as method arguments (think just an element.appendChild(other) that happens within a worker) and whatnot!</p><p>The limit in my case is not even my imagination, it’s simply the lack of a better way to deal with this … so here I come with a proposal: <a href="https://github.com/WebReflection/serialization-registry#readme"><strong>a SerializationRegistry namespace</strong></a> to rule them all!</p><pre>class Serializable {<br>  static revive([count, data]) {<br>    const ref = new this(data);<br>    ref.#count = count;<br>    return ref;<br>  }<br>  // private properties via reviver? ✅<br>  #count = 0;<br>  #data;<br>  constructor(data) {<br>    this.#data = data;<br>  }<br>  // accessors? ✅<br>  get access() {<br>    return this.#count;<br>  }<br>  get data() {<br>    this.#count++;<br>    return this.#data;<br>  }<br>  // define how to travel?
✅<br>  get [SerializationRegistry.symbol]() {<br>    return SerializationRegistry.transfer(<br>      &#39;my-project@Serializable&#39;,<br>      [this.#count, this.#data],<br>    );<br>  }<br>}<br><br>// define how to revive? ✅<br>SerializationRegistry.register(<br>  &#39;my-project@Serializable&#39;,<br>  Serializable.revive.bind(Serializable),<br>);</pre><p>OK then, we have our very own class that:</p><ul><li>defines a SerializationRegistry.symbol accessor, which will be accessed while cloning, just the way Symbol.toStringTag or others work, so that it’s clear no argument would ever be passed</li><li>that accessor explicitly returns something to transfer, assuming the class has been registered with that unique identifier, either in this world or in the receiving one</li><li>SerializationRegistry.register registers that unique identifier so that this class, as a module, would work to both send and receive its own kind</li></ul><p>… and that’s it? Let’s see it in practice:</p><pre>const ref = new Serializable({ some: &#39;data&#39; });<br><br>// just to trigger the accessor and increment count<br>ref.data;   // { some: &#39;data&#39; }<br>ref.data;   // { some: &#39;data&#39; }<br>ref.access; // 2<br><br>// let&#39;s post that reference<br>postMessage({ extras: true, data: [1, ref, 2] });<br><br><br>// on the receiver side<br>self.onmessage = event =&gt; {<br>  const { extras, data } = event.data;<br>  const ref = data.at(1);<br>  console.log(ref); // instance of Serializable<br>  ref.data;   // { some: &#39;data&#39; }<br>  ref.access; // 3<br>};</pre><p>… how wonderful is that?</p><ul><li>we can just define, when/where appropriate, a way to both serialize and deserialize data</li><li>we don’t need to change anything else around the code</li><li>proxies can handle that <em>symbol</em> when accessed and never throw</li><li>no special IDs, properties, extra checks or extra crawling is needed to retrieve back, or send, data as we meant in our
program</li><li>… profit for everyone?</li></ul><h4>Not just transferable …</h4><p>Of course the SerializationRegistry offers a way to register, unregister or explicitly transfer data, but its current special symbol, which ideally could instead be a global Symbol.toStructuredClone so that it’d be detached from the registry logic (although used internally), allows us to simply intercept clone intents and provide a substitute:</p><pre>const pythonHandler = {<br>  get(target, prop) {<br>    if (prop === SerializationRegistry.symbol) {<br>      if (target instanceof PythonProxy)<br>        return target.to_js();<br>    }<br>    return Reflect.get(target, prop);<br>  }<br>};<br><br>const ref = new Proxy(python_ref, pythonHandler);<br><br>structuredClone(ref); // it will not throw 🥳</pre><h3>In Summary</h3><p>It took me years of experience in the <em>Proxy</em>, <em>Atomics</em>, <em>Workers</em>, <em>MessageChannel</em>, <em>SharedArrayBuffer</em> w/ binary serialization or <em>FinalizationRegistry</em> field to land what seems to be an obvious and simple enough proposal to tame the most powerful, yet limited, API we have to send or clone data in JS, and I would be more than happy to answer any question around this proposal. But please, if you think it solves it all for you too, help me and everyone else working with <em>JS</em> and <em>Web</em> based primitives to move this proposal forward, because it is really missing out there and it’s really bad that only the PL itself can decide what class can travel and what cannot … so thanks in advance to whoever will help us all to have a way to fix the structured clone related issues that keep affecting our less trivial projects 🙏</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=130608b69f47" width="1" height="1" alt="">]]></content:encoded>
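The limitations listed above are easy to reproduce; here is a minimal sketch, assuming Node 17+ or any browser where structuredClone is a global:

```javascript
// Class identity does not survive the structured clone algorithm,
// and proxies (like functions) make it throw a DataCloneError.
class Point {
  constructor(x, y) { this.x = x; this.y = y; }
}

const clone = structuredClone(new Point(1, 2));
console.log(clone.x, clone.y);       // 1 2: own data travels
console.log(clone instanceof Point); // false: the class is gone

let proxyThrew = false;
try {
  structuredClone(new Proxy({}, {}));
} catch (err) {
  proxyThrew = true;
}
console.log(proxyThrew); // true

let fnThrew = false;
try {
  structuredClone({ method() {} });
} catch (err) {
  fnThrew = true;
}
console.log(fnThrew); // true
```

In other words, anything beyond plain data currently needs out-of-band bookkeeping, which is precisely the gap a registry-based hook would close.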
        </item>
        <item>
            <title><![CDATA[A JS Buffer maxByteLength Solution?]]></title>
            <link>https://webreflection.medium.com/a-js-buffer-maxbytelength-solution-79123867e749?source=rss-cc83da4b8256------2</link>
            <guid isPermaLink="false">https://medium.com/p/79123867e749</guid>
            <category><![CDATA[typescript]]></category>
            <category><![CDATA[web-development]]></category>
            <category><![CDATA[javascript]]></category>
            <category><![CDATA[shared-array-buffer]]></category>
            <category><![CDATA[dataview]]></category>
            <dc:creator><![CDATA[Andrea Giammarchi]]></dc:creator>
            <pubDate>Thu, 27 Feb 2025 21:18:37 GMT</pubDate>
            <atom:updated>2025-02-27T21:23:29.330Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*gg9Vcpe-FdF3f6UYXoOeXg.jpeg" /><figcaption>Photo by <a href="https://unsplash.com/@elevatebeer?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Elevate</a> on <a href="https://unsplash.com/photos/person-holding-amber-glass-bottle-w9_XGvzxvxo?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Unsplash</a></figcaption></figure><p>When low-level Web APIs are based on <em>guesstimates </em>you already know what the outcome could be: a (very likely slow) mess!</p><p>Meet <a href="https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer/maxByteLength">maxByteLength on MDN</a> to learn more about what this is …</p><h4>Why not <a href="https://en.cppreference.com/w/c/memory/realloc">realloc</a>?</h4><p>Here’s the thing: in order to resize an ArrayBuffer or a SharedArrayBuffer we need to provide upfront a maxByteLength parameter which is based on … assumptions, nothing else.
There’s no way to explain what that value is or what a good amount would be; we have <em>hints </em>from the Web that it had better not exceed <em>1GB </em>of data, while NodeJS apparently has a <em>2GB</em> cap, and yet this amount is strictly platform dependent because, on constrained environments, even asking for <em>1GB</em> could fail, due to a <em>limited amount of RAM</em> inferior to such limit, or one already overwhelmed by the rest of the system.</p><p>Add in the fact that a library that would like to provide best effort/performance over a <em>Raspberry Pi</em>, as well as a <em>128GB RAM based server</em>, is incapable of deciding a best effort on the spot and that’s it: an API that backfires on users’ intents, as opposed to providing what has been around forever in the programming field, <a href="https://en.cppreference.com/w/c/memory/realloc"><strong>realloc</strong></a>!</p><h4>Super slow via SharedArrayBuffer</h4><p>Trust me when I say that a resizable ArrayBuffer is going to be 2X, up to 5X, faster than a resizable SharedArrayBuffer, up to the point you wonder if using just duplicated memory to then fill up the <em>shared</em> one would outperform just having a resizable shared one around … and you’d be surprised if you perform only a single grow call on that one and fill its bytes via a <em>view</em> set, instead of working directly with that <em>shared</em> one … but you duplicated needed <em>RAM</em> so you’ll probably fail at that point anyway?</p><h4>Benchmark</h4><p>I know you don’t want to take my rant for granted, I wouldn’t either, so here are some numbers:</p><pre>RESIZABLE BUFFER<br>encode: 3.037ms cold<br>decode: 2.514ms cold<br>encode: 1.389ms hot<br>decode: 0.78ms  hot<br><br>RESIZABLE SHARED BUFFER<br>encode: 7.77ms  cold<br>decode: 3.093ms cold<br>encode: 4.477ms hot<br>decode: 2.298ms hot<br><br>MAGIC VIEW FIXED BUFFER<br>encode: 4.995ms cold<br>decode: 3.695ms cold<br>encode: 1.513ms hot<br>decode: 1.585ms hot<br><br>MAGIC VIEW RUNTIME
BUFFER<br>encode: 5.089ms cold<br>decode: 3.892ms cold<br>encode: 2.08ms  hot<br>decode: 2.013ms hot<br><br>DATA VIEW DECODE<br>decode: 1.591ms cold<br>decode: 0.95ms  hot<br><br>FIXED BUFFER - REFERENCE<br>encode: 1.817ms cold<br>decode: 1.901ms cold<br>encode: 1.006ms hot<br>decode: 1.058ms hot</pre><p>Let me break that down for you:</p><ul><li>a <strong>RESIZABLE BUFFER</strong> is one that knows upfront what the buffer size is going to be … “<em>magic guessing</em>” that might overflow the guessed max size</li><li>a <strong>RESIZABLE SHARED BUFFER</strong> falls into the same previous category, providing way slower resizes in the making</li><li>a <strong>MAGIC VIEW FIXED BUFFER</strong> is there just to compare imaginary worlds, where you would use such utility and yet you already know the final size of the buffer, so that no resize is ever needed</li><li>a <strong>MAGIC VIEW RUNTIME BUFFER</strong> is what this post is about: a way to transparently re-grow the underlying ArrayBuffer so that you don’t need to think about any of this at all and <em>RAM</em> is preserved by all means</li><li>a <strong>DATA VIEW DECODE</strong> is what you would use to decode a pre-allocated ArrayBuffer one way or another; it’s the fastest native decoding thing we have to date on the Web</li><li>a <strong>FIXED BUFFER — REFERENCE</strong> is there to represent how fast all of this could be if the DataView instance had an already perfectly sized ArrayBuffer that can be used to both encode or decode, with enough <em>RAM</em> available and guaranteed on the running system … it’s the benchmark reference not by accident!</li></ul><h4>Analysis</h4><p>In an ideal world, we should never use the resizable ArrayBuffer primitive, because it’s slow on resizing, but that’s like “<em>crystal ball programming</em>”.</p><p>On the other hand, the fastest alternative is to use a resizable ArrayBuffer, but that might suddenly go “<em>out of bounds</em>” if we had no idea what the size of the
“<em>thing</em>” we were going to encode would be: it cannot resize on demand, it checks the system behind the scenes before agreeing that the maxByteLength size is reasonable, without giving us any way to retrieve such heuristic.</p><p>Further down we have a deadly slow SharedArrayBuffer primitive that cannot compete by any means with generic ArrayBuffer performance. The reason we use this primitive is to have a <em>shared</em> reference we can <em>await</em> on when bytes have been filled, but discovering that such primitive is so slow at resizing might be a bottleneck or a show-stopper already.</p><p>As the common case is to <em>fill data</em> into your view, I have created a thin abstraction that provides all DataView methods over an instance able to track changes and, only when needed, create a new buffer after transferring the previous one internally, so that there will be enough room for extra data as long as the system has <em>RAM</em> available, bypassing the need to know how much memory is available in there.</p><p>This module is known as, or called, <a href="https://github.com/WebReflection/magic-view#readme"><strong>MagicView</strong></a>, and it’s my current attempt to forget about all these constraints we have around <em>RAM</em> based <em>APIs</em>, allowing with nonchalance all DataView related operations plus the ability to set any dynamic typed array value or even an array of numbers, as long as those numbers are within the <em>uint8</em> boundaries.</p><h3>MagicView in a nutshell</h3><ul><li>it’s a DataView abstraction that lets you forget about memory constraints</li><li>it’s ideal for incrementally filling up the ArrayBuffer with data</li><li>the resulting buffer can be used to decode anything via native DataView or even Uint8Array capabilities</li><li>you can use its extra setTyped, getTyped, setArray and getArray methods when convenient for your use case (MessagePack or Buffered Clone related fields)</li></ul><p>So that is basically it: one day
we’ll have a “<em>just grow as needed</em>” primitive or method that does the right thing, in a way developers can stop guessing target hardware capabilities and memory availability; today that simplification is represented by this module and, as the benchmark shows, it’s a wonder to deal with while encoding, because it’s nearly as fast as any other native alternative and more than twice as fast as the SharedArrayBuffer equivalent when it comes to encoding data.</p><p>Enjoy 👋</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=79123867e749" width="1" height="1" alt="">]]></content:encoded>
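The core idea can be sketched in a handful of lines; this toy GrowingView is illustrative only and shares neither code nor API with the actual MagicView module:

```javascript
// Toy "grow as needed" DataView wrapper: when a write would overflow,
// copy the bytes into a buffer twice as large and carry on. The real
// module is more sophisticated; this only shows the technique.
class GrowingView {
  constructor(byteLength = 16) {
    this.buffer = new ArrayBuffer(byteLength);
    this.view = new DataView(this.buffer);
    this.byteLength = 0; // bytes actually written so far
  }
  ensure(needed) {
    let size = this.buffer.byteLength;
    if (needed > size) {
      while (needed > size) size *= 2; // double until it fits
      const grown = new ArrayBuffer(size);
      new Uint8Array(grown).set(new Uint8Array(this.buffer));
      this.buffer = grown;
      this.view = new DataView(grown);
    }
  }
  setUint32(offset, value) {
    this.ensure(offset + 4);
    this.view.setUint32(offset, value);
    if (offset + 4 > this.byteLength) this.byteLength = offset + 4;
  }
  getUint32(offset) {
    return this.view.getUint32(offset);
  }
}

const gv = new GrowingView(4);
let i = 0;
while (16 > i) {          // 64 bytes total: forces several grows
  gv.setUint32(i * 4, i);
  i += 1;
}
console.log(gv.byteLength);    // 64
console.log(gv.getUint32(60)); // 15
```

Doubling amortizes the copies, so a long run of writes touches the allocator only a logarithmic number of times, the same trade-off realloc-style growth makes.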
        </item>
        <item>
            <title><![CDATA[TypeScript: when it helps & when it’s worse!]]></title>
            <link>https://webreflection.medium.com/typescript-when-it-helps-when-its-worse-9acfe6301220?source=rss-cc83da4b8256------2</link>
            <guid isPermaLink="false">https://medium.com/p/9acfe6301220</guid>
            <category><![CDATA[typescript]]></category>
            <category><![CDATA[javascript]]></category>
            <dc:creator><![CDATA[Andrea Giammarchi]]></dc:creator>
            <pubDate>Fri, 10 Jan 2025 16:08:18 GMT</pubDate>
            <atom:updated>2025-01-10T16:55:06.500Z</atom:updated>
            <content:encoded><![CDATA[<p>I love <em>TS</em> until the point I hate it, and this post is about the latter part.</p><h4>Community Expectations</h4><p>Well … Yes! Like many other core library developers that would avoid <em>TS</em> for reasons I’ll explain in here, among others, we all want to publish on <em>npm</em> something that can provide a great <strong>DX</strong> all over the developers’ space: targeting people sticking with vanilla <em>JS</em> or people using <em>TS</em> as default, and it’s all doable behind the scenes!</p><p>This post is about exploring just one of (hopefully not) many cases where <em>TS</em> makes you believe your code is better, or safer, while it’s not, actually the opposite … and from a developer (me) that, during his <em>certified</em> <em>PHP</em> days was all about “<em>strict</em>” (or even <em>all</em>) error reporting around the code, trust me: this is not looking good at all, especially when <em>TS</em> enthusiasts don’t understand the underlying issue!</p><iframe src="https://cdn.embedly.com/widgets/media.html?type=text%2Fhtml&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;schema=twitter&amp;url=https%3A//x.com/Benjamin_Aster/status/1877700712286519423&amp;image=" width="500" height="281" frameborder="0" scrolling="no"><a href="https://medium.com/media/e0bade76945e7848cc0d5d2b335f62f0/href">https://medium.com/media/e0bade76945e7848cc0d5d2b335f62f0/href</a></iframe><h3>A Failing Use Case</h3><p>You can <a href="https://www.typescriptlang.org/play/?#code/MYewdgzgLgBA7gWxgXhmApnGB1dBDAawFk8AHAHgHkAjAK3WCgBpoAnASzAHMBtAXQB8AbgBQoSLALoAnihgBvAL6iR7AGYwAFAEJEAOgAWeCJqnSAlOZEx4CPRHRRTMpjH7nR+ro+cW9pAFcIA00AcmwAVQAVAH5QjxEgA">try this simple case yourself over the TypeScript Playground</a>:</p><pre>const wm = new WeakMap&lt;Object,string[]&gt;;<br>const key = {};<br><br>if (!wm.has(key))<br>  wm.set(key, []);<br>wm.get(key).push(&#39;WUT?&#39;);<br>// ^^^^^^^^ oh, &quot;the horror&quot;!</pre><p>You can play with strict VS non
strict behavior and see that the error will always be the same:</p><pre>Object is possibly &#39;undefined&#39;.</pre><h4>And What Could Possibly Go Wrong …</h4><p>First of all, in a single-threaded <em>PL</em> like <em>JS</em>, nothing at all could possibly go wrong … unless:</p><ul><li>the guard is about using a potentially poisoned get method of a Map or a WeakMap randomly used in the code (<em>spoiler</em>: it’s not!)</li><li>the guard is about <em>TS</em> <strong>not</strong> understanding the context around that WeakMap or Map in the wild … 🤔</li></ul><p>Not sure you really needed to think about these options, but the answer is … 🥁🥁🥁 … the latter! 🥳</p><p>If a .get can fail, so can any .has or .set accordingly, but most importantly, in there I wrote code I want to be sure about: if such .get fails, I expect a “<em>poisoned environment, GTFO or bad things will happen!</em>” sort of instantly throwing message … right? <strong>… right?</strong></p><p>I don’t want that code to silently fail and, moreover, because I trust I can find what I have been looking for (literally 2 LOC before, but the logic breaks further) … I would never expect any error otherwise, according to the <strong>logic I explicitly wrote</strong>!</p><h4>Enter TS Misleading Safety Feeling</h4><p>Most of the TS lovers still reading this would just fix that issue with an “<em>innocent</em>” ?, wouldn’t they?</p><pre>// previous code ...<br>wm.get(key)?.push(&#39;WUT?&#39;);</pre><p>… fair enough, now I am going to ask you: what did that change about your original intent in writing the code the way you meant to write it?</p><p>Here are some clues:</p><ul><li>that code will silently fail in there, keep going and doing things after, instead of throwing in case that .get method was poisoned by evil code around (that is WeakMap or Map .prototype.get to be clear)</li><li>my code trusts no evil code is around (bad for library authors, still the most common case in the wild) so I just want the
<em>TS</em> compiler to be happy … writing meaningless and potentially less predictable code in the making … (congrats? what happens when that becomes a habit? 🙃)</li></ul><p>Again, fair enough if you are OK with the second clue, yet … have you read, and understood, the first one at all?</p><p>Basically what we are saying is:</p><blockquote>it’s OK if TS makes me write code that is more prone to 3rd party evil attacks and bugs, as I cannot guarantee anymore that my expectations would fail fast when needed, i.e. when a poisoned or broken .get happens!</blockquote><p>… but wasn’t the whole premise of <em>TS</em> about writing better code?</p><h4>Enter @ts-ignore</h4><p>The moment I need to write that comment anywhere in my otherwise fully valid <em>JS</em> code is the moment I start questioning what <em>TS</em> <strong>really</strong> brings to the table … to me it starts falling into one of these categories:</p><ul><li>I use <em>TS</em> to help me refactor later on (likely the only use case I agree about)</li><li>I use <em>TS</em> because it makes me write better code (honestly the worst use case I agree about)</li><li>I use <em>TS</em> because everyone else uses it (fair, in terms of job-market thinking, yet … not fully satisfying?)</li></ul><p>Wondering about me?
… Well, I am more about:</p><ul><li>I use <em>TS</em> because the community asks for it and my IDE understands <em>JSDoc TS</em>, so that occasionally I have hints instead of shenanigans created by <em>TS</em> itself that I need to fix later</li></ul><p>… and that’s pretty much it, yet I’m happy when it works!</p><h3>Personal Conclusions</h3><p><em>TS</em> is there to stay, or even take over the <em>JS</em> world, but it still has tons of things to improve and I feel like it’s still in its early stages, despite all the great things it improved over recent years.</p><p>My personal thinking, though, is why anyone would choose to make a powerful scripting language less powerful, and more error prone, due to intrinsic conflicts between being strictly typed and naturally being “<em>just a scripting language</em>” (if not the best one) like <em>JS</em> is … I feel like everyone claims successful stories around how much better <em>TS</em> made their daily workflow, but nobody is able to admit how many times that workflow is stubbornly stuck behind, or slowed down by, <em>TS</em>’ incapability to really understand the <em>JS</em> code expectation and logic underneath, like I’ve shown in this post.</p><p>Once again, I love <em>TS</em> and what it does for me on a daily basis, when I try to make it work out of the awesome <em>JSDoc TS</em> integration, but it is in these cases that I wonder if I’m wasting my time instead, using directly or indirectly a project that feels not fully there yet.</p><p>Curious to know about your feelings around this topic too though and please, no hate, just sharing, thanks 👋</p><p>Closing with this gem: folks, it’s not about how strict or not strict <em>TS</em> is, it’s about <em>TS</em> <strong>not</strong> understanding the code, where strict checks make it just more obvious that <em>TS</em> <strong>does not</strong>, in fact, <strong>understand the surrounding code</strong>!</p><iframe
src="https://cdn.embedly.com/widgets/media.html?type=text%2Fhtml&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;schema=twitter&amp;url=https%3A//x.com/WebReflection/status/1877760392845197325&amp;image=" width="500" height="281" frameborder="0" scrolling="no"><a href="https://medium.com/media/f1a5cc0980e440ae7f8d77a8ceecf5ec/href">https://medium.com/media/f1a5cc0980e440ae7f8d77a8ceecf5ec/href</a></iframe><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=9acfe6301220" width="1" height="1" alt="">]]></content:encoded>
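To make the trade-off concrete, here is a plain JS sketch (names are illustrative, not from the post) of keeping the fail-fast intent without sprinkling the silencing question mark around:

```javascript
// A tiny helper that preserves the "throw loudly if .get is poisoned"
// intent: the undefined check is explicit, so nothing fails silently
// and the optional-chaining workaround is no longer needed.
const wm = new WeakMap();

function listFor(map, key) {
  if (!map.has(key)) map.set(key, []);
  const list = map.get(key);
  if (list === undefined) {
    // reachable only with a poisoned or broken .get / .has / .set
    throw new TypeError('poisoned WeakMap');
  }
  return list;
}

const key = {};
listFor(wm, key).push('WUT?');
console.log(wm.get(key)); // [ 'WUT?' ]
```

The explicit undefined check also narrows the type cleanly under strict TS, while still throwing loudly the moment a poisoned .get hands back nonsense.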
        </item>
        <item>
            <title><![CDATA[MDN doesn’t trust you, should you trust MDN?]]></title>
            <link>https://webreflection.medium.com/mdn-doesnt-trust-you-should-you-trust-mdn-93fce7768076?source=rss-cc83da4b8256------2</link>
            <guid isPermaLink="false">https://medium.com/p/93fce7768076</guid>
            <category><![CDATA[web]]></category>
            <category><![CDATA[security]]></category>
            <dc:creator><![CDATA[Andrea Giammarchi]]></dc:creator>
            <pubDate>Mon, 14 Oct 2024 16:09:59 GMT</pubDate>
            <atom:updated>2024-10-14T19:55:20.848Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*-al3Ucv5y5hPDVqj8lgKwg.jpeg" /><figcaption>Photo by <a href="https://unsplash.com/@bernardhermant?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Bernard Hermant</a> on <a href="https://unsplash.com/photos/round-red-and-white-trust-signage-OLLtavHHBKg?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Unsplash</a></figcaption></figure><p>There are <a href="https://www.quora.com/If-someone-doesnt-trust-you-should-you-trust-them-Should-trust-be-established-mutually">some lovely Quora answers</a> regarding this topic, and I’d like to already break the ice around the fact that <a href="https://developer.mozilla.org/en-US/">MDN</a> has been the <em>go-to</em> website when it comes to Web specifications, documentation, examples, broader context around APIs or experimental features and whatnot, so that if you are here to “<em>hate</em>”, I want to clarify that MDN remains (imho) the most trustworthy reference for Web developers out there these days … until they are not though, hence this post.</p><h4>some background</h4><p>While many on X that don’t know me at all described the topic I am going to talk about as an “<em>ego issue</em>”, these are some historical facts around me and my 25+ years of experience in the field:</p><ul><li>I am an Open Source and Open Web believer since my first days of Web development: I’ve learned from OSS and I’ve always tried to give back</li><li>I have written polyfills to move the Web forward since <em>IE4</em> times (yeah, I am 46 years old, and I have contributed as I can since I was 20 or even earlier), but if you don’t believe me, I’ve written <a href="https://webreflection.blogspot.com/2006/11/my-domcontentloaded-final-solution.html">polyfills</a> before the term even existed, for browsers like Firefox 1 and others around at that time … please don’t stop there though, me writing polyfills has been a
thing up to today, and FAANG to startups have used my code in production too</li><li>I did write some whole pages on MDN in the past, plus I did contribute in more recent times, due to time constraints, by amending here and there on occasion … <a href="https://github.com/mdn/content/pull/36285">even recently</a></li><li>as annoying as I could be on (hopefully rare) occasions, when I disagree around some topic, my history of working experience from FAANG to startups can probably confirm I am also perfectly capable of letting it go and quickly moving forward over disagreements, or even suggesting the best outcome for everyone out of such disagreement</li><li>I have sporadically contributed to TC39 (ECMAScript / JS) specifications, or some WHATWG idea, plus I collaborated with Igalia to help them bring in the CSS :has(...) selector (and other less popular topics)</li></ul><p>“<em>But dude, why should I care about all this?</em>” You are right, I might be “<em>Mr Nobody</em>” to you, but I think in the specs, and Web field in general, I probably or hopefully gained some trust: not as the best dev, surely not as the best dev to deal with, hopefully as one that has records of trusted contribution to the Web itself without ever causing issues with his popular, up to hundreds of millions of downloads per month, OSS ideas.</p><p>And yet, MDN wouldn’t care a bit about me, my previous work, my history there, my contribution to their browser in the past, or the fact I also do OSS like they do, with tests, coverage, and cross-browser or cross-engine intents, because this is the answer I got from my recent (double) <a href="https://github.com/mdn/content/pull/36294#issuecomment-2407401296">PR</a>:</p><blockquote>We generally reject links to one’s own work</blockquote><p>If you are still reading, I’ve tried to explain that it’s not like I want a special throne on the Web or anything, but from there to being a “<em>generally rejected link</em>” there are oceans:</p><ul><li>what does “<em>generally</em>”
mean in there? Where is the explanation of what it takes to be excluded from that rule? … <em>crickets</em>!</li><li>why is a project born as a community project, one that doesn’t even use my GitHub namespace, such as @ungap, considered “<em>one’s own work</em>”?</li></ul><h3>Meet the @ungap project</h3><p>The <a href="https://ungap.github.io/">@ungap project</a> is a community-driven effort to bring mostly newly spec’d APIs to browsers, in a “<em>best effort</em>” way that doesn’t include all the bloat other polyfills would include by default.</p><p>Quoting the project’s goal:</p><blockquote><strong>Pragmatic is better than (im)perfect</strong></blockquote><blockquote>There are parts of the specifications that are very hard, if not impossible, to polyfill. The main purpose of this project is to help developers move forward, and possibly without unnecessary bloat. This basically means that polyfills are written to support 99% of the use cases, without granting 100% spec compliance.</blockquote><blockquote>If you need that, which again is basically impossible in most of the cases, feel free to keep using whatever monolithic polyfill or approach you were using before.</blockquote><p>We all rant about JS bloat here and there, yet most don’t even realize that the moment they use any transpiler, bloat is included by default to help them write code that wouldn’t otherwise necessarily be usable, out of the box, in the wild … and here <a href="https://github.com/zloirock/core-js">@core-js</a> plays a wonderful (no irony intended) role:</p><ul><li>it’s used by Babel and other transpilers, so that even if you don’t know about it, it’s likely part of your code-base or toolchain</li><li>it’s under a single person’s repository, not an organization or a collaborative place: you need to file issues against that original repo in order to get anything approved</li><li>it’s paranoid about JS pollution, so it includes its internals all over the place. 
This is not a bad thing in general, but it’s free <strong>repeated bloat</strong> for anything you need or use from that repository</li><li>because of the previous point, if you include <strong>2 ponyfills</strong> separately from core-js, your bundle size will <strong>double</strong> out of the box <strong>for no reason</strong> and, most importantly, with <strong>no extra security guarantees</strong></li></ul><p>But here is the catch:</p><ul><li>core-js is popular because popular bundlers use and trust it, and MDN promotes it</li><li>core-js’ code-base is defensive by design, but it’s objectively as vulnerable as anything else on the Web: the moment some evil script manages to pollute the environment, if core-js is loaded after it, in a module, or lazily, it’s doomed like any other JS script that exists to date (until a native import from ‘esm:Object’ lands in browsers and no bundler dares touch it)</li><li>because of all the previous points, <strong>contributing to core-js is a giant effort</strong>: learning all the ways core-js works, to grant that pseudo-feeling of security, and delivering, requires a lot of time to investigate or a lot of time fixing whatever PR lands in there. Do you know it already? Maybe it’s easy … do you just want to move forward with some recent spec? 
Good luck there, I value my time more than ever these days (full-time employee + father of a 2.5yo kid)</li></ul><p>So, besides the fact that I value my time, the time it would take me to change the current core-js polyfill around anything would probably mean a week of work, if lucky, while solving the problem my way would take, depending on the task, of course, from half an hour to 3 hours for an MVP that already solves whatever I needed to solve …</p><h3>Broken MDN metrics</h3><blockquote>We generally want some proof of popularity (downloads, stars…) before committing to suggesting it</blockquote><p>Let’s recap what happened here … I’ve proposed an <a href="https://github.com/ungap/raw-json">alternative</a> <strong>ponyfill</strong> for a specification that nobody even knows exists, and the popularity argument has been used as a reason to close my effort to contribute?</p><p>I smell a catch-22 situation here, or favoritism all over the place:</p><ul><li>you promote only core-js in there: how is anyone else even able to contribute by adding smaller, faster, yet still as safe, alternatives that could possibly become more popular?</li><li>how can you dismiss, ignore, or erase the history of the person trying to contribute for the community, one that has contributed already a few times, one that has collaborated with standards here and there, one that is trying to help the community back by stating “<em>here there’s a ponyfill that you can try with ease without breaking or slowing down everything around its usage</em>”?</li><li>how can you base your decision, as a guideline for experimental related contributions, on metrics for something that maybe landed the day before and that nobody even knew existed?</li></ul><p>So here I got triggered, because none of the above points make sense to me … maybe the fact the reviewer had no idea about me is even OK, but what is the nonsense around this popularity point, when MDN publishes features that don’t (yet) exist in standards 
and ignores anyone maybe even excited about such a feature, or one like me that needs that feature daily, one that even spent time to provide, via a community project, a solution that doesn’t cause bloat or a global slow-down for every JSON related operation?</p><p>I believe the answer there, now and forever, would be <em>crickets</em>, because once again, while everyone on X has been fast enough to blame my ego, few understood that there is an overall issue with all the reasons the PR has been closed out of the box: no discussion ever started … “<em>I suck</em>”, that’s it!</p><h3>The FUD around Security</h3><blockquote>We are particularly wary about links to polyfills, because (a) they will eventually be removed in favor of native solutions (b) they represent one of the most vulnerable attack vectors for supply chain attacks. Therefore, we have by convention decided that we will only include core-js polyfills.</blockquote><p>This is the real reason I am here to talk about <a href="https://github.com/mdn/content/pull/36294#issuecomment-2407401296">this review process</a>:</p><ul><li><em>polyfills</em> will be removed, and so will <em>ponyfills</em>. The MR in charge of doing that, once the time is right, is exactly <strong>one</strong> … so my extra link wouldn’t really have bothered anyone in the future; it’s not that I added a maintenance burden, a topic I would’ve paid more attention to had it been presented</li><li>they don’t have any process in place to validate that the core-js polyfill actually works as the standards meant; they just accept without a doubt that core-js works … if they had such a process, other links would’ve been welcomed, or rejected by CI, because it would mean they tested the standard behavior, and could make an informed decision about accepting, or rejecting, that PR. 
Here they just decided any link that solves the issue, possibly in a better way, is not worth it, but no process exists to make sense of that decision …</li><li>they are assuming that I, my account, my OSS, are vulnerable, without providing an explanation of “<em>why is that</em>” or “<em>what can I do to make it less vulnerable</em>” … have I learned anything from that PR and my effort to improve the DX around this newly introduced API? <strong>No</strong>, nothing at all, all I know is that <strong>my code is considered vulnerable out of the box</strong>, thank you MDN!</li><li>they tried to make the security argument later on, not while closing the PR, and that is even worse than the rest of this story … keep reading …</li><li>there is a convention that established core-js is not vulnerable, and boy if I have links around that …</li></ul><p>First of all, I already wrote a post about the fact that <a href="https://javascript.plainenglish.io/about-trusting-javascript-execution-8c6b478d6021">nobody can trust JS execution</a>: it’s by JS design, and it doesn’t matter how paranoid the stack you are running is, the moment my evil.js script has a chance to <a href="https://github.com/zloirock/core-js/blob/master/packages/core-js/internals/function-call.js">run before your trusted core-js internals</a>, you are doomed, trust me!</p><p>I wrote <a href="https://github.com/WebReflection/proxy-pants#readme">libraries that help mitigate this issue</a> as long as these run before evil.js, that’s the contract: first run, first safe, there is no “<em>but</em>” here!</p><p>It’s not that I am new to the security concerns world either: I have been working with security-concerned companies (as a security service) where I could taint a whole environment to surface security issues for production sites … again, not my ego, it’s just a matter of fact that I know JS inside out and I have written about security concerns “<em>forever</em>”.</p><p>Moreover, the moment you discard any 
community contribution in the name of security, implicitly inferring that the contributor is not a trusted JS developer, is the moment you are shutting out anyone out there that actually has a way to contribute to the community by providing code that understands these concerns and provides better solutions that still work under the best conditions, just like core-js does if no evil.js file has landed on the page before its execution.</p><p>So here I might have lost my patience, answered badly, you name it, but if you make security a joke and you blame security on others, you are doing a disservice to anyone reading that thread, anyone believing core-js is infallible, anyone trying to learn what security means on the Web.</p><p>Security means you are behind CSP, you have trusted and validated every single script that lands on your page, including those 3rd party Ads related scripts, and you are sure, by code validation before going to production, that no script, unless trusted, would ever try to pollute global prototypes ahead of time … <strong>now that’s security</strong>, not your cheap and easy blame on ad-hoc issues created for the sake of creating those issues, because once again, I wrote “<em>evil</em>” code that can feel native, once introspected, by all means, and others did the same before me! 
core-js is not immune, and nobody should believe that using core-js means no security risks can possibly exist on the Web.</p><h3>As Summary</h3><p><strong>Don’t trust MDN around security concerns</strong>: they have way more work to do before developers can be really informed about them, and the fact that they promote only a single <em>polyfill</em> in their documentation is rather an increased security risk (an ad-hoc core-js evil.js as a single point of failure) than a guard for Web developers.</p><p>This was the post, this was my rant around it … security has no compromises; if you care about security, and you should, that Pull Request is not the place you’ll learn anything about it, quite the opposite: it’s making you feel safer about their choice while you are not.</p><p>The rest of that discussion went from ugly to unbearable to me, so once again, I might have over-reacted, but I do take security concerns seriously and that was not the case in there … and on top of that, these are my presented alternative’s stats:</p><ul><li>it doesn’t patch the global JSON primitive, hence it doesn’t make it slower by default; that’s what a <em>ponyfill</em> does</li><li>it’s as “<em>secure</em>” (according to those broken metrics used in there) as core-js</li><li>it’s 90% smaller as plain text, 60% smaller once minified</li><li>it’s <a href="https://ungap.github.io/raw-json/test/">3x up to 5x faster than core-js</a> once minified and gzipped</li></ul><p>This is <a href="https://github.com/ungap/raw-json#readme">raw-json</a>, a ponyfill MDN doesn’t want you to use or try … and this is the end, I hope I’ve given something back to OSS, one more time 💕</p><h4>Update</h4><p>MDN <strong>locked</strong> my issue and <strong>my account</strong> after also pointing me at 3 other locked discussions … and when did that happen? 
After I invalidated all the arguments in there, providing fixes for the mentioned “<em>security</em>” concerns plus benchmarks on code size and performance where my <em>ponyfill</em> wins by far over the <em>polyfill</em> they propose to all Web developers.</p><p>That’s the Open Web “<em>at its best</em>”, isn’t it?</p><p><a href="https://github.com/mdn/content/pull/36294#issuecomment-2411767537">https://github.com/mdn/content/pull/36294#issuecomment-2411767537</a></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=93fce7768076" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[On JS Closures and Leaks]]></title>
            <link>https://webreflection.medium.com/on-js-closures-and-leaks-74e523124e15?source=rss-cc83da4b8256------2</link>
            <guid isPermaLink="false">https://medium.com/p/74e523124e15</guid>
            <category><![CDATA[closures-functions]]></category>
            <category><![CDATA[javascript]]></category>
            <category><![CDATA[memory-leak]]></category>
            <dc:creator><![CDATA[Andrea Giammarchi]]></dc:creator>
            <pubDate>Thu, 01 Aug 2024 16:14:18 GMT</pubDate>
            <atom:updated>2024-08-01T18:03:40.948Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*-YAyh4ZWl1guCEhtRs47vg.jpeg" /><figcaption>Photo by <a href="https://unsplash.com/@daanmooij?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Daan Mooij</a> on <a href="https://unsplash.com/photos/water-coming-out-from-gray-pipe-91LGCVN5SAI?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Unsplash</a></figcaption></figure><p>If you haven’t (entirely) read <a href="https://jakearchibald.com/2024/garbage-collection-and-closures/">this post by Jake</a>, you should.</p><p>Now hear me out: the moment <em>JS</em> developers would need to track or <em>nullify</em> variables as if they were writing <em>Rust</em> is the moment any reason for scripting to exist is basically dead … so I am not going down that road, I am taking a tangent to this issue.</p><h3>Avoid closures when not needed</h3><p>As simple and silly as this might sound, every time you write this:</p><pre>const outer = { huge: &#39;reference&#39; };<br>setTimeout(() =&gt; {<br>  console.log(outer);<br>}, 1000);</pre><p>you are better off with this:</p><pre>const outer = { huge: &#39;reference&#39; };<br>setTimeout(console.log, 1000, outer);</pre><p>And every time you write this:</p><pre>class Counter {<br>  constructor(element) {<br>    this.i = 0;<br>    element.addEventListener(&#39;click&#39;, () =&gt; {<br>      console.log(this.i++);<br>    });<br>  }<br>}</pre><p>you are better off with this:</p><pre>class Counter {<br>  constructor(element) {<br>    this.i = 0;<br>    element.addEventListener(&#39;click&#39;, this);<br>  }<br>  handleEvent(event) {<br>    console.log(this.i++);<br>  }<br>}</pre><p>I wrote, about 7 years ago, why the latter pattern matters beyond this blog post’s topic, and <a href="https://webreflection.medium.com/dom-handleevent-a-cross-platform-standard-since-year-2000-5bf17287fd38">you should read it</a> if you didn’t know about such 
pattern.</p><h3>Track your references with ease</h3><p>I have been dealing with memory leak related topics for many years, and recently that bar got raised via <em>WASM</em> related projects where not leaking references to foreign PLs is crucial.</p><p>Because I wrote tons of libraries to make leaks less likely, I also recently updated the most important one, the one that deals with the <a href="https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/FinalizationRegistry">FinalizationRegistry</a>, which now exposes a handy utility to track collected references via console.debug right in your browser’s <em>devtools</em>.</p><pre>import BUG_GC from &#39;https://esm.run/gc-hook/track&#39;;<br><br>// HINT: use a constant so that rollup or bundlers<br>// can eventually remove all the dead code in production<br>// when the following constant is `false` instead<br>const D = true;<br><br>// create any reference<br>let test = { any: &#39;value&#39; };<br><br>// when debugging, pass an object literal to simplify<br>// naming -&gt; references convention<br>D&amp;&amp;BUG_GC({ test });<br><br>setTimeout(() =&gt; { test = null; });<br>// now press the Collect Garbage button in devtools<br>// and see the lovely message: **test** collected</pre><p><a href="https://github.com/WebReflection/gc-hook#readme">The gc-hook module</a> is 100% code covered and it’s been used in production for more than a year to avoid leaks from <em>WASM</em> targeting PLs to <em>JS</em> and vice-versa. Its /track export is just a utility that uses the module behind the scenes and shows, via console.debug, any reference that was collected. You can name references via an <em>object literal</em> and read in <em>devtools</em> when these are collected. If you don’t read anything after pressing the <em>Collect Garbage</em> button in your <em>devtools</em>, it simply means that never happened, the end.</p><p>P.S. 
you need to enable verbose / all levels in devtools to read console.debug in there … please double check before thinking your reference actually leaked!</p><h3>As Summary</h3><p>Apparently, due to performance reasons, <em>JS</em> engines are refusing to fix a bug that is actually haunting the <strong>Web</strong> when it comes to memory consumption.</p><p>There are, however, easy ways to avoid closure leaks, such as:</p><ul><li>avoid closures when not necessary … that includes every setTimeout or setInterval that is not using extra arguments after the delay. If your counter-argument is that such a practice requires extra memory to retain that outer scoped callback, try to do the math: is it better to have a little extra overhead once, or to have that callback overhead (the new closure each time) also leaking forever in your code?</li><li>avoid closures that re-assign methods in classes so these can be used as listeners; there is a handleEvent pattern that has worked fast and well since the year 2000</li><li>track references with ease through the gc-hook/track module if you are worried something won’t get cleaned up</li><li>use the <em>React compiler</em> if you are using <em>React</em>, as apparently it tries to mitigate the issue too</li><li>be mindful of the unnecessary closures you create in your code … those are easy to write but apparently extremely difficult to digest behind the scenes … nobody wins if each closure results in a leak (luckily, that’s not always the case; you should <em>track</em> that though)</li></ul><p>Happy <em>TS</em> or <em>JS</em> coding everyone 👋</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=74e523124e15" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[A SharedArrayBuffer Polyfill]]></title>
            <link>https://webreflection.medium.com/a-sharedarraybuffer-polyfill-0fd568c0061a?source=rss-cc83da4b8256------2</link>
            <guid isPermaLink="false">https://medium.com/p/0fd568c0061a</guid>
            <category><![CDATA[polyfill]]></category>
            <category><![CDATA[shared-array-buffer]]></category>
            <category><![CDATA[web-development]]></category>
            <dc:creator><![CDATA[Andrea Giammarchi]]></dc:creator>
            <pubDate>Tue, 02 Jul 2024 11:54:33 GMT</pubDate>
            <atom:updated>2024-07-02T11:59:43.609Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*n4xwrbjwXU21-ZRgc4NdbA.jpeg" /><figcaption>Photo by <a href="https://unsplash.com/@steve_j?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Steve Johnson</a> on <a href="https://unsplash.com/photos/red-and-silver-screw-driver-lH-UZuoG-aY?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Unsplash</a></figcaption></figure><p>That’s it, that’s the post, really … if you look for <a href="https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/SharedArrayBuffer"><em>SharedArrayBuffer</em></a> in search engines, the only thing that comes out is related to issues with headers, <em>COI</em>, <em>CORP</em>, <em>COEP</em>, and all the acronyms nobody needing this primitive really cares about, so here I am telling you that I have managed to polyfill this primitive for all Desktop to Mobile browsers, and it’s published on both <a href="https://www.npmjs.com/package/sabayon">npm</a> and <a href="https://github.com/WebReflection/sabayon#readme">GitHub</a> under the name <strong>sabayon</strong> (<strong>S</strong>hared<strong>A</strong>rray<strong>B</strong>uffer <strong>a</strong>lwa<strong>y</strong>s <strong>on</strong>).</p><h4>… previously …</h4><p>I have <a href="https://webreflection.medium.com/about-sharedarraybuffer-atomics-87f97ddfc098">previously talked about the beauty and power of this primitive</a>, but it’s only over the last weekend that I decided to nail down an orchestration that “<em>just works</em>”<em>(™️)</em>, one that wouldn’t compromise security at all, or require special headers that browser vendors can’t even agree on, or need the <a href="https://developer.mozilla.org/en-US/docs/Web/API/Window/credentialless#browser_compatibility">iframe credentialless attribute</a>, still in an implementation limbo, when embedded foreign content is desired in websites that actually 
need <em>Atomics</em> and “<em>SAB</em>”.</p><h3>Sabayon — the nitty gritty</h3><p>Hopefully explained reasonably well in its <em>GitHub</em> <em>README</em> under the “<em>How does it work?</em>” section, this module provides all the related primitives to be used in either the main thread or the worker one.</p><p>A cryptographically secure unique identifier is used per page or tab to ensure safe communication across workers and main threads, and of course the <em>Shared</em> bit of the equation is a facade over an <em>ArrayBuffer</em> orchestration, but that’s fully transparent to this module’s users.</p><pre>// Worker example<br>import {<br>  Atomics,<br>  Int32Array,<br>  SharedArrayBuffer,<br>  addEventListener,<br>  postMessage,<br>  ignore,<br>} from &#39;sabayon/worker&#39;;<br><br>addEventListener(&#39;message&#39;, event =&gt; {<br>  const { handle, complex } = event.data;<br>  handle[0] = 1;<br>  Atomics.notify(handle, 0);<br>});<br><br>const sab = new SharedArrayBuffer(4);<br>const view = new Int32Array(sab);<br><br>postMessage({<br>  handle: view,<br>  passThrough: ignore({ complex: &quot;data&quot; })<br>});<br><br>Atomics.waitAsync(view, 0).value.then(_ =&gt; {<br>  console.log(&#39;view changed&#39;, [...view]);<br>});<br><br>Atomics.wait(view, 0);<br>console.log(&#39;view changed&#39;, [...view]);<br><br>// Main example<br>import {<br>  Atomics,<br>  Int32Array,<br>  SharedArrayBuffer,<br>  Worker,<br>  ignore,<br>} from &#39;sabayon/main&#39;;<br><br>const w = new Worker(&#39;./worker.js&#39;, {<br>  type: &#39;module&#39;,<br>  serviceWorker: &#39;./sw.js&#39;, // optional<br>});<br><br>w.addEventListener(&#39;message&#39;, event =&gt; {<br>  const { handle, complex } = event.data;<br>  handle[0] = 1;<br>  Atomics.notify(handle, 0);<br>});<br><br>const sab = new SharedArrayBuffer(4);<br>const view = new Int32Array(sab);<br><br>w.postMessage({<br>  handle: view,<br>  passThrough: ignore({ complex: &quot;data&quot; })<br>});<br><br>Atomics.waitAsync(view, 
0).value.then(_ =&gt; {<br>  console.log(&#39;view changed&#39;, [...view]);<br>});</pre><p>Because the module is actually not obtrusive at all, what you’d get from it, if the right headers are around, is the native, untouched deal, except for waitAsync, which is automatically patched in Firefox, as <a href="https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Atomics/waitAsync#browser_compatibility">it apparently didn’t get the memo</a>; but if the headers are not there, and <a href="https://github.com/WebReflection/mini-coi#readme">solutions to enable those headers</a> are not desired, you’ll get the whole thing just as if everything was fine to start with: a win-win situation 🥳</p><p>Enjoy the beauty of buffer based exchanges between workers and the main thread without the hassle and/or implications these primitives have behind the Web scene 👋</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=0fd568c0061a" width="1" height="1" alt="">]]></content:encoded>
        </item>
    </channel>
</rss>