<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Dan Jenkins]]></title><description><![CDATA[Thoughts, stories and ideas.]]></description><link>https://dan-jenkins.co.uk/</link><image><url>https://dan-jenkins.co.uk/favicon.png</url><title>Dan Jenkins</title><link>https://dan-jenkins.co.uk/</link></image><generator>Ghost 5.86</generator><lastBuildDate>Tue, 21 Apr 2026 12:47:06 GMT</lastBuildDate><atom:link href="https://dan-jenkins.co.uk/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[The Journey to YouFibre]]></title><description><![CDATA[<p>TLDR: I&apos;ve just had YouFibre 8000 installed, it&apos;s great! There were many chapters to the story to get to the conclusion but I&apos;m there now, and all is good... mostly. But it is a long tale... hence the TLDR.</p><p>I&apos;ve recently had</p>]]></description><link>https://dan-jenkins.co.uk/the-journey-to-youfibre/</link><guid isPermaLink="false">67a8d439db6aa8005f05f0d0</guid><dc:creator><![CDATA[danjenkins]]></dc:creator><pubDate>Sun, 09 Feb 2025 21:04:02 GMT</pubDate><media:content url="https://dan-jenkins.co.uk/content/images/2025/02/Screenshot-2025-02-09-at-15.31.16-1.png" medium="image"/><content:encoded><![CDATA[<img src="https://dan-jenkins.co.uk/content/images/2025/02/Screenshot-2025-02-09-at-15.31.16-1.png" alt="The Journey to YouFibre"><p>TLDR: I&apos;ve just had YouFibre 8000 installed, it&apos;s great! There were many chapters to the story to get to the conclusion but I&apos;m there now, and all is good... mostly. But it is a long tale... hence the TLDR.</p><p>I&apos;ve recently had FTTP broadband from <a href="https://www.youfibre.com/?ref=dan-jenkins.co.uk" rel="noreferrer">YouFibre</a> installed but boy, has it been a journey. 
Here&apos;s the tale of that journey and some thoughts about things that could have gone better. Side note... it looks like their branding is &quot;youfibre&quot; but the company is YouFibre Limited. I&apos;ll be referring to them as YouFibre.</p><p>Back in March 2023 I replied to an <a href="https://www.ispreview.co.uk/?ref=dan-jenkins.co.uk" rel="noreferrer">ispreview.co.uk</a> tweet about another ISP adding more towns to their planned FTTP rollout. My <a href="https://x.com/dan_jenkins/status/1639348943032049679?ref=dan-jenkins.co.uk" rel="noreferrer">response</a> called out all the &quot;altnets&quot; for mostly just using Openreach&apos;s PIA product - anything more complicated didn&apos;t get looked at - specifically calling out Netomnia because of their plans in my area. Netomnia were planning to build, or were already building, in the area, but checks on my postcode showed they weren&apos;t coming to me.... despite coming to properties 200m up the road. Their CEO, Jeremy, replied asking for my details so he could look into it.</p><p>This was the start of two years&apos; worth of conversations with Jeremy and the build team at Netomnia. The reason my road wasn&apos;t &quot;in scope&quot; was the &quot;JUP&quot;s - Joint Utility Poles - on my road. Openreach use UK Power Networks poles along my road (and many others in my village) to provide service; without any poles or ducts available via PIA, Netomnia decided to put my road on &quot;indefinite hold&quot;. I had many conversations with Jeremy, who as CEO of both Netomnia and YouFibre really does go out of his way to talk to customers when they have issues, and ultimately it looked like there might be a way to get things rolling, but it wasn&apos;t to be for &quot;reasons&quot; - it sounded like a UKPN issue rather than a Netomnia one. I have a 15-response-deep email thread with the build team trying to figure out how to get around this UKPN issue, but it all came to an unhappy ending. 
</p><p>All was not lost though: based on Netomnia&apos;s data, they were going to deliver their network down the main road behind mine, and in direct line of sight of my house was an Openreach pole, so I was pretty sure a Netomnia CBT would likely end up on top of it - I just had to wait. Fast forward to the 8th of January 2025, when I noticed a CBT being put on it.... it wasn&apos;t Openreach, so I knew it was most likely Netomnia. Over the next few weeks I saw more works going on in the area and eventually what looked like an extension of the Netomnia spine fiber, with a man in a van splicing all these new CBTs into the core of the Netomnia network on the 28th of January. On Sunday February 2nd, I checked again and saw the CBT was live for orders.</p><p>Now onto the first thing that could have gone better... I emailed the Netomnia build team explaining that the CBT was in my direct line of view, around 50m from my house, and asking whether they could serve me from it. I got a response the following day saying &quot;your address is on indefinite hold&quot;, as though they hadn&apos;t even read the email. I also live chatted YouFibre and got told &quot;we don&apos;t service your address&quot;. I wanted to give this company my money, and they want to serve as many properties as possible to pay off their build costs. Surely there was a way to get this done. Luckily, I&apos;d entered my details on the YouFibre search against the property on the road behind me - a doctor&apos;s surgery, which was classed as a business address. I got a call from the business sales team on the Monday; I explained the situation and by the end of the day they called me back saying all was good and we could move forward. By Tuesday morning the YouFibre website was telling me my address was &quot;live&quot; and ready for service (I even got a knock on the door from the local YouFibre rep because the house had gone live). 
Imagine getting something done that quickly with Virgin Media or BT.... but I don&apos;t understand why the business sales team needed to be the ones to push it forward... I gave the build team at Netomnia the exact same pictures explaining the situation - it should have been a simple &quot;that makes sense, let&apos;s do that&quot;.</p><p>Anyway, everything got sorted Tuesday, with a survey offered for Wednesday, which I couldn&apos;t do due to prior commitments, so they came on Thursday instead... The survey turned into an install, and by lunchtime I had a 50m drop cable installed from the pole over to my house, an ONT connected to it, and the ONT connected to my UniFi Dream Machine Pro. Fantastic. As part of the business service you get a static IPv4 address instead of YouFibre&apos;s default of CGNAT; unfortunately this was initially missed from the order, but a bit of comms with YouFibre over live chat and with my business manager got it resolved fairly quickly. However, I was getting odd speed test results - this was before I enabled IPv6 on the UDM Pro... I&apos;d get 2-3 Gbps down and 6-7 up against multiple speed test servers. I really didn&apos;t understand it. Then I enabled IPv6 on the UDM Pro and got the full 8 up and 8 down.... Fantastic... 
but something was still wrong with IPv4 traffic....</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://dan-jenkins.co.uk/content/images/2025/02/Screenshot-2025-02-07-at-17.47.34.png" class="kg-image" alt="The Journey to YouFibre" loading="lazy" width="800" height="1218" srcset="https://dan-jenkins.co.uk/content/images/size/w600/2025/02/Screenshot-2025-02-07-at-17.47.34.png 600w, https://dan-jenkins.co.uk/content/images/2025/02/Screenshot-2025-02-07-at-17.47.34.png 800w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Weird IPv4 Speedtest results</span></figcaption></figure><p>A quick note on the next small disappointment: the next day, I found the remaining unused part of the drop cable in my wheelie bin. I&apos;m not sure how much there was, but it wasn&apos;t nothing... I appreciate fiber probably isn&apos;t easily recyclable, but surely if every install results in such waste, and all this waste was collected... something could be done with it instead of it going to landfill or being burned. It just seems like a missed opportunity.</p><p>I hadn&apos;t been given a YouFibre-supplied router as part of the install - the Netomnia team didn&apos;t have one with them. So I wondered whether this odd speed was a me issue, and decided to wait until I could test with their supplied router. Saturday came, and here is the next thing that could have gone better. I had an 8am til 1pm window for an appointment.... 1pm came and still no engineer, and no phone call to tell me of a delay. I both live chatted and called YouFibre. I was in a queue for 33 minutes on the phone before someone answered the call. I was waiting 20 minutes for an agent to be assigned to the live chat before I closed the conversation because someone had picked up my call. 
Waiting 33 minutes for the phone to be answered on a business service isn&apos;t good enough, and nor is waiting 20 minutes for a live chat agent on a 24/7 service. It then took a further 40 minutes to get a &quot;someone will come, but I can&apos;t tell you when&quot; response. It feels like YouFibre have some things to work out in terms of support resourcing.</p><p>An engineer finally turned up and I eventually did some tests: I got 8 gig up and 8 gig down with both IPv4 and IPv6 enabled... so it wasn&apos;t a network problem like I thought! A side note here.... they&apos;ve just started installing &quot;YouFibre Hub Pro&quot; routers - in the past they&apos;ve given people on the You 8000 tier an <a href="https://rog.asus.com/uk/networking/rog-rapture-gt-axe16000-model/?ref=dan-jenkins.co.uk">Asus AXE16000</a> (I think). Now they have their own-branded YouFibre Hub Pro, which offers a 10 gig WAN port, a 10 gig LAN port and four 1 gig LAN ports, as well as a phone port, and it looks like it supports WiFi 6E... maybe WiFi 7? I&apos;m not sure. 
I haven&apos;t seen anything about this router online, and the engineer told me he had only installed one other of these.</p><figure class="kg-card kg-image-card"><img src="https://dan-jenkins.co.uk/content/images/2025/02/IMG_5442-1.jpeg" class="kg-image" alt="The Journey to YouFibre" loading="lazy" width="1000" height="1333" srcset="https://dan-jenkins.co.uk/content/images/size/w600/2025/02/IMG_5442-1.jpeg 600w, https://dan-jenkins.co.uk/content/images/2025/02/IMG_5442-1.jpeg 1000w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://dan-jenkins.co.uk/content/images/2025/02/IMG_5441-1.jpeg" class="kg-image" alt="The Journey to YouFibre" loading="lazy" width="1000" height="1333" srcset="https://dan-jenkins.co.uk/content/images/size/w600/2025/02/IMG_5441-1.jpeg 600w, https://dan-jenkins.co.uk/content/images/2025/02/IMG_5441-1.jpeg 1000w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://dan-jenkins.co.uk/content/images/2025/02/IMG_5440-1.jpeg" class="kg-image" alt="The Journey to YouFibre" loading="lazy" width="1000" height="1333" srcset="https://dan-jenkins.co.uk/content/images/size/w600/2025/02/IMG_5440-1.jpeg 600w, https://dan-jenkins.co.uk/content/images/2025/02/IMG_5440-1.jpeg 1000w" sizes="(min-width: 720px) 720px"></figure><p>When you have a static IP from YouFibre, their ONT gives out this static IP via DHCP, but with a 1-hour lease, so moving from one router to another is a pain. I decided to clone the YouFibre router&apos;s WAN MAC address onto the UDM Pro to remove this headache, so I didn&apos;t have to wait the hour.... I did another speed test on IPv4 expecting to still see the weird speeds and instead got 8 gig both up and down! 
Whether it was the YouFibre engineer adding the router to my account, or using the YouFibre router&apos;s MAC address, that made the difference, I have no idea - and I really don&apos;t want to mess with it further needlessly to find the answer.</p><figure class="kg-card kg-image-card"><img src="https://dan-jenkins.co.uk/content/images/2025/02/Screenshot-2025-02-09-at-15.31.16.png" class="kg-image" alt="The Journey to YouFibre" loading="lazy" width="800" height="1218" srcset="https://dan-jenkins.co.uk/content/images/size/w600/2025/02/Screenshot-2025-02-09-at-15.31.16.png 600w, https://dan-jenkins.co.uk/content/images/2025/02/Screenshot-2025-02-09-at-15.31.16.png 800w" sizes="(min-width: 720px) 720px"></figure><p>And if you came here looking for information on what the IPv6 prefix size is... it&apos;s a /56. The YouFibre router tells you this (because it all gets provisioned automatically), and random information out on the internet from other users tells you this.... but there&apos;s nothing on the YouFibre website - just a &quot;we use DHCP, just plug in&quot; in their FAQs. There are also claims on forums that business customers get a /48 prefix. Mine is definitely a /56. It would be great to have more of this information available in their FAQs.</p><p>All in all, I used to get 1 gig down and 100 meg up with Virgin Media Business via their DOCSIS network (and a weird tunnel for static IPs), which meant 24ms pings to Cloudflare and 30ms pings to Google... now I have 8 gig up and down with 3ms pings to Cloudflare and 5ms to Google. Do I need 8 gig symmetric internet? No. Was it much more, as a business, to give me the flexibility to use whatever I needed up to 8 gigabit for the next 2 years? No, not at all.</p>
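If you're wondering what a /56 (or the rumoured business /48) actually buys you in practice, the arithmetic is simple - here's a quick shell sketch, purely illustrative and with nothing YouFibre-specific assumed:

```shell
# Each LAN/VLAN conventionally gets its own /64, so a delegated prefix of
# length N yields 2^(64-N) distinct /64 subnets.
prefix_len=56
echo "$(( 1 << (64 - prefix_len) ))"   # a /56 -> 256 possible /64 LANs

prefix_len=48
echo "$(( 1 << (64 - prefix_len) ))"   # a /48 -> 65536 possible /64 LANs
```

In other words, even a /56 is plenty for a home or small business: 256 separate /64 networks to spread across VLANs.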
<!--kg-card-begin: html-->
<a title="Broadband Ping" href="https://www.thinkbroadband.com/broadband/monitoring/quality/share/885ff6494a30f826cc6c2be630982cb234c3e3b9?ref=dan-jenkins.co.uk"><img alt="The Journey to YouFibre" src="https://www.thinkbroadband.com/broadband/monitoring/quality/share/885ff6494a30f826cc6c2be630982cb234c3e3b9.png"></a>

<a title="Broadband Ping" href="https://www.thinkbroadband.com/broadband/monitoring/quality/share/921ee1b67fd03d0b383705a0595708df4a9d391d?ref=dan-jenkins.co.uk"><img alt="The Journey to YouFibre" src="https://www.thinkbroadband.com/broadband/monitoring/quality/share/921ee1b67fd03d0b383705a0595708df4a9d391d.png"></a>


<!--kg-card-end: html-->
<p>If you&apos;re interested in taking up YouFibre, and this was helpful, my referral link is <a href="https://aklam.io/pElnmm?ref=dan-jenkins.co.uk">https://aklam.io/pElnmm</a> - you get some money and I get some money too.</p><p>There are some learnings from this journey with Netomnia and YouFibre, and I hope they see them as constructive criticism - I&apos;ve followed them as companies for years, long before they announced plans for my nearest town (and village), and I really want them to do well. For the first time ever, I&apos;ve got an ISP who gives me native IPv6 with proper FTTP and is forward-looking, and that&apos;s just fantastic. They&apos;re already looking at allowing greater speeds on their network this year. Will I upgrade? Unlikely.... but it&apos;s great to know you&apos;re on a network that <em>could</em> support it, without the ridiculous price you&apos;d pay for a leased line of that speed.</p><p>One other ask for the YouFibre team would be to install a Ubiquiti speed test server on their network alongside their Ookla Speedtest server. It would be great to be able to test directly from the UDM Pro against a YouFibre server - more information can be found at <a href="https://community.ui.com/questions/Host-your-speedtest-server/23e8e2c8-6fdb-4e08-8016-6cea98cabfdc?ref=dan-jenkins.co.uk">https://community.ui.com/questions/Host-your-speedtest-server/23e8e2c8-6fdb-4e08-8016-6cea98cabfdc</a> - you can even make it so only YouFibre IP addresses get offered it!</p><p>Thanks to Jeremy and the YouFibre business team for getting it done! Here&apos;s to living in 2025 with FTTP internet with both native IPv4 and native IPv6.</p>]]></content:encoded></item><item><title><![CDATA[Enabling native ARM64 builds with Github Actions]]></title><description><![CDATA[<p>For a while now I&apos;ve been running Broadcaster.VC&apos;s github actions on a spare machine I have in the office. 
Building things for both x64 and arm64 could take some time and if I let those builds happen on Github&apos;s hosted runners I ran</p>]]></description><link>https://dan-jenkins.co.uk/enabling-native-arm64-builds-with-github-actions/</link><guid isPermaLink="false">6676e0a3bcdd230156168c51</guid><dc:creator><![CDATA[danjenkins]]></dc:creator><pubDate>Tue, 03 May 2022 14:23:32 GMT</pubDate><content:encoded><![CDATA[<p>For a while now I&apos;ve been running Broadcaster.VC&apos;s GitHub Actions on a spare machine I have in the office. Building things for both x64 and arm64 could take some time, and if I let those builds happen on GitHub&apos;s hosted runners I ran the risk of spending money I didn&apos;t need to - I had spare compute going in the office, so why not take advantage of it?</p><p>When I was running things on GitHub&apos;s runners, things weren&apos;t slow, but they weren&apos;t blazingly fast either, so I didn&apos;t really notice how slow things truly were. When I moved to a self-hosted runner in the office, I did notice... compiling on an AMD Ryzen 9 5900X with 24 threads meant my x64 builds were lightning fast - taking advantage of every available thread meant those builds were super quick and I couldn&apos;t have been happier. But it was a totally different story for my arm64 builds, where the QEMU virtualisation that Docker buildx handled for me meant those builds barely made any use of the 24 threads available.</p><p>For a while I wondered how I could make it quicker... could I add another native arm64 runner to the GitHub Actions pool? No I couldn&apos;t - because then the multi-arch Docker images I was trying to create wouldn&apos;t be multi-arch at all. Ultimately the way to do it was so simple I&apos;m annoyed it took me so long to figure out. 
You need an arm64 host with docker installed, and for that docker server to be available from within the same network as where you&apos;re running your self hosted runner - you might choose to run an AWS Graviton host or just run a Pi4 within the same network... that&apos;s what I did. I&apos;d have preferred to run the Github runner on an M1 mac mini.... but that was an expense I didn&apos;t want to cover just yet.</p><p>So how do you do it? Like I said, first things first, install Docker on your arm64 host and then make it available over the network - you can follow the instructions in this <a href="https://gist.github.com/styblope/dc55e0ad2a9848f2cc3307d4819d819f?ref=dan-jenkins.co.uk">gist</a>. Once you&apos;ve got that... it&apos;s as simple as adding a few lines to your Github Action yaml file.</p><!--kg-card-begin: markdown--><pre><code class="language-yaml">      - name: Set up QEMU
        uses: docker/setup-qemu-action@v1

      - name: Set up Docker Buildx
        id: builder
        uses: docker/setup-buildx-action@v1

      - name: &quot;Append ARM buildx builder&quot;
        uses: baschny/append-buildx-action@v1
        with:
          builder: ${{ steps.builder.outputs.name }}
          endpoint: &quot;tcp://the-fqdn-or-ip-of-arm64-host:2375&quot;
</code></pre>
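For context, the append-buildx-action step above is doing roughly what you could do by hand with the buildx CLI. Here's a sketch of the manual equivalent (the builder name `multiarch` and the image tag are placeholders; the endpoint assumes a Docker daemon exposed over TCP as in the gist):

```shell
# Create a builder whose first node is the local (x64) daemon, and select it.
docker buildx create --name multiarch --use

# Append the remote arm64 daemon as a second node; buildx then routes each
# requested platform to a node that supports it natively instead of using QEMU.
docker buildx create --name multiarch --append tcp://the-fqdn-or-ip-of-arm64-host:2375

# A single build now produces a genuinely multi-arch image.
docker buildx build --platform linux/amd64,linux/arm64 -t example/image:latest --push .
```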
<!--kg-card-end: markdown--><p>You&apos;ve probably already got the &quot;Set up QEMU&quot; step and the &quot;Set up Docker Buildx&quot; step, but there&apos;s an important addition you need in that step... adding an &quot;id&quot; to the buildx builder.</p><p>The next section is the important one... you want to append an arm64 host to your buildx setup so that it won&apos;t try to build arm64 images in a virtualised buildx builder. And that&apos;s how simple it is... you don&apos;t need to try to add more GitHub Actions runners... you just need to give that Docker Buildx environment the right kind of host it can ask for a build.</p><p>The way things are set up today: I commit to GitHub, that asks my local Linux x64 machine to build the project, and in turn that x64 runner adds a local Pi 4 to its Docker buildx environment, and both the Pi and the Ryzen machine build things in parallel. Obviously the Ryzen machine finishes much earlier, but the Pi 4 is faster than letting the Ryzen emulate arm64 and build using QEMU - which feels crazy, but it&apos;s definitely true.</p><hr><p>Now I could definitely get even faster builds than the Pi 4 offers... and I did wonder if setting up an arm64 emulated host using Proxmox would be even quicker. But it&apos;s unlikely to be worth the time to investigate much further - the biggest win would be to run on dedicated arm64 hardware with more horsepower than a Pi 4 and, as far as I&apos;m aware, the only thing that fits the bill would be an M1 Mac. I have an M1 Mac, but it&apos;s my laptop and won&apos;t always be available within the network, so it&apos;s a poor choice for builds.</p><p>Know a better way or a cheaper arm64 host I could use? Let me know over on Twitter - @dan_jenkins</p><p>(Sorry for lack of links.... 
this was a very quick write-up!)</p>]]></content:encoded></item><item><title><![CDATA[YubiKey protected SSH Keys]]></title><description><![CDATA[<p>This is something I&apos;ve been meaning to try out for a while now but never found the right moment in time to just <em>do it</em>. Just because you have access to an SSH key on a file system, why should that mean you have the right to use</p>]]></description><link>https://dan-jenkins.co.uk/yubikey-protected-ssh-keys/</link><guid isPermaLink="false">6676e0a3bcdd230156168c50</guid><dc:creator><![CDATA[danjenkins]]></dc:creator><pubDate>Tue, 22 Feb 2022 09:23:15 GMT</pubDate><content:encoded><![CDATA[<p>This is something I&apos;ve been meaning to try out for a while now but never found the right moment in time to just <em>do it</em>. Just because you have access to an SSH key on a file system, why should that mean you have the right to use it?</p><p>Firstly, if you don&apos;t know what a YubiKey is then let&apos;s catch you up. It&apos;s a small device that communicates with your computer or phone via USB/Bluetooth/NFC and enables you to handle Multi-Factor Authentication. Best described as:</p><blockquote>Authentication using two or more different factors to achieve authentication. Factors include something you know (e.g., PIN, password), something you have (e.g., cryptographic identification device, token), or something you are (e.g., biometric). See authenticator.</blockquote><p>In typical MFA use, it&apos;s something you know and something you have - your password to a website and a physical device OR a software-generated PIN. If you haven&apos;t already started using MFA via an app that gives you codes (which is called TOTP auth) or via a physical device like a YubiKey then you should get started today. 
If your password to your favourite online bookshop got leaked through a data breach, for example, that password would be useless to an attacker because they don&apos;t have the second part of your MFA - something you have - a YubiKey or a code from an app. YubiKey is just a manufacturer of such devices, and FIDO/U2F is an open standard, which means there are other providers out there - <a href="https://solokeys.com/?ref=dan-jenkins.co.uk">https://solokeys.com/</a> for example.</p><hr><p>My YubiKey is a 5 series and so supports U2F/FIDO2, which means I can use it to protect an SSH key generated with OpenSSH. Keys which would give me access to a client&apos;s infrastructure, for example, or the key that allows me to pull and push code to GitHub - these keys are extremely powerful in what they give access to, but most of us just let them reside on our machines, completely open to the possibility of being copied and used maliciously. So I thought it was high time I practiced what I preached and moved my SSH key over to being protected by my YubiKey.</p><p>GitHub started supporting this back in May 2021 and honestly it couldn&apos;t be easier to start using it - just follow the instructions at <a href="https://github.blog/2021-05-10-security-keys-supported-ssh-git-operations/?ref=dan-jenkins.co.uk">https://github.blog/2021-05-10-security-keys-supported-ssh-git-operations/</a> - you may need to update OpenSSH on your system before being able to do this.</p><figure class="kg-card kg-image-card"><img src="https://dan-jenkins.co.uk/content/images/2022/02/Screen-Shot-2022-02-22-at-09.08.27.png" class="kg-image" alt loading="lazy"></figure><p>The &quot;ed25519-sk&quot; is the special bit - specifically the &quot;-sk&quot; part. </p><p>Press the button on your YubiKey device and you&apos;ll get asked where you want to save the private/public keys. 
I already had a key at /Users/danjenkins/.ssh/id_ed25519_sk so I changed the location for this new key - and you can see a private and public key got generated.</p><figure class="kg-card kg-image-card"><img src="https://dan-jenkins.co.uk/content/images/2022/02/Screen-Shot-2022-02-22-at-09.10.28.png" class="kg-image" alt loading="lazy"></figure><p>The resulting public key is now available to upload to an SSH server, Git server or whatever else knows how to accept one of these &quot;new&quot; keys.</p><figure class="kg-card kg-image-card"><img src="https://dan-jenkins.co.uk/content/images/2022/02/Screen-Shot-2022-02-22-at-09.10.48-1.png" class="kg-image" alt loading="lazy"></figure><p>Upload that public SSH key to your GitHub account and now, whenever you do a remote operation in Git, like fetching a remote, pulling or pushing, you&apos;ll be asked for a physical interaction with your YubiKey - much more secure than it ever was before!</p><figure class="kg-card kg-image-card"><img src="https://dan-jenkins.co.uk/content/images/2022/02/Screen-Shot-2022-02-21-at-10.38.57.png" class="kg-image" alt loading="lazy"></figure><p>It&apos;s more annoying than before - there have been a few times where I&apos;ve done a &quot;git push origin &lt;branchname&gt;&quot; and wondered why a build hadn&apos;t started minutes later, realising that I hadn&apos;t complied with the new requirement of interacting with my 2FA device. But I know this is just a habit that I&apos;ll build soon enough, and I know that both my business&apos;s security and my clients&apos; security are much better than before.</p><hr><p>PS - Did you know you can get all the public keys GitHub knows about for your account by using the URL https://github.com/&lt;username&gt;.keys? 
Next time someone asks for your ssh key just point them at your github .keys url :) Mine can be found at <a href="https://github.com/danjenkins.keys?ref=dan-jenkins.co.uk">https://github.com/danjenkins.keys</a></p><p>PPS - You can limit what kind of key is accepted by OpenSSH Server - you can read more about that at <a href="https://cryptsus.com/blog/how-to-configure-openssh-with-yubikey-security-keys-u2f-otp-authentication-ed25519-sk-ecdsa-sk-on-ubuntu-18.04.html?ref=dan-jenkins.co.uk">https://cryptsus.com/blog/how-to-configure-openssh-with-yubikey-security-keys-u2f-otp-authentication-ed25519-sk-ecdsa-sk-on-ubuntu-18.04.html </a></p><p>PPPS - Just as I was about to publish this post, Tom Lawrence from <a href="https://lawrencesystems.com/?ref=dan-jenkins.co.uk">Lawrence Systems</a> published a video talking about exactly this so go and watch his video where I&apos;m sure he explains things much better - <a href="https://www.youtube.com/watch?v=PjDFk8xdtGw&amp;ref=dan-jenkins.co.uk">https://www.youtube.com/watch?v=PjDFk8xdtGw</a> </p><figure class="kg-card kg-embed-card"><iframe width="200" height="113" src="https://www.youtube.com/embed/PjDFk8xdtGw?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></figure>]]></content:encoded></item><item><title><![CDATA[The Met's idea of Authentication]]></title><description><![CDATA[<p>Let me first just show you the tweet from the Met Police about their solution to plain clothed officer authentication following the killing of Sarah Everard by a Met police officer.</p><!--kg-card-begin: html--><blockquote class="twitter-tweet" data-dnt="true"><p lang="en" dir="ltr">Any lone, plain clothed police officer who engages with a woman on her own will now verify their identity</p></blockquote>]]></description><link>https://dan-jenkins.co.uk/the-mets-idea-of-authentication/</link><guid 
isPermaLink="false">6676e0a3bcdd230156168c4f</guid><dc:creator><![CDATA[danjenkins]]></dc:creator><pubDate>Thu, 21 Oct 2021 12:25:40 GMT</pubDate><content:encoded><![CDATA[<p>Let me first just show you the tweet from the Met Police about their solution to plain clothed officer authentication following the killing of Sarah Everard by a Met police officer.</p><!--kg-card-begin: html--><blockquote class="twitter-tweet" data-dnt="true"><p lang="en" dir="ltr">Any lone, plain clothed police officer who engages with a woman on her own will now verify their identity through a new process. <br><br>We know we need to regain women&#x2019;s trust. <br><br>We fully accept the onus is on us to verify we are who we say we are &amp; that we are acting appropriately.</p>&#x2014; Metropolitan Police (@metpoliceuk) <a href="https://twitter.com/metpoliceuk/status/1450848637194547205?ref_src=twsrc%5Etfw&amp;ref=dan-jenkins.co.uk">October 20, 2021</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script><!--kg-card-end: html--><p>Let&apos;s not take anything away from the reason this is needed in the first place, a horrific crime by someone in a position of power, within one of the world&apos;s most well known Police forces against a lone female. But the presented solution from the Met seems just a little odd. Let&apos;s take a look at what they are proposing...</p><p>Allowing women who are stopped by a plain clothes police officer to ask for verification that the person standing in-front of them is indeed a serving police officer, who is lawfully allowed to stop them. They would do this authentication using the police officer&apos;s mobile phone, video calling into the control room for a uniformed officer to confirm things.</p><h3 id="but-what-are-the-issues-with-that">But what are the issues with that?</h3><p>a) why is this option only open to women? 
There are many other people who could benefit from such an authentication solution, and not just for protecting against kidnapping - it should be open to anyone wishing to authenticate the plain clothed officer.<br>b) the officer in question uses their phone to video call the &quot;control room&quot;, with a uniformed officer able to corroborate that the officer is acting lawfully.<br>&#x2003;b1) why should the person involved trust this uniformed officer? It could literally be someone dressed up as a police officer.<br>&#x2003;b2) how are these video calls going to take place? WhatsApp? MS Teams?<br>&#x2003;b3) how are these video calls going to get prioritised so they don&apos;t go unanswered?<br>c) the officer is in control of this situation, not the person who wants to check authenticity.<br>d) 2FA requires us to prove multiple things with known entities; a video call over an unknown medium, to an unknown user on the other end, leads to a lack of trust in the &quot;solution&quot;.</p><hr><h2 id="so-what-can-we-do-instead">So what can we do instead? </h2><p>Ultimately I believe this to be a simple problem to solve using existing technology available to all of the Police forces across the UK - standard 2FA practices.</p><p>It all starts with the person who&apos;s been stopped being able to call a number outside of the control of the police officer standing in front of them - maybe a new 3-digit code like we have for 999, 101, 111 and 119. This would be done from the person&apos;s phone - they&apos;re in control of the call, which removes any suspicion around whether the call is legitimate or not. This would be answered by an automated system asking for the Officer&apos;s warrant card number - these are unique; every officer has one. 
If the phone system didn&apos;t understand or couldn&apos;t find the warrant card number, the call would get redirected to the police 999 centre automatically.</p><p>If the warrant card number was found, this automated phone system would interrogate the relevant systems to find out whether the officer was on duty - could there be other checks done here? I don&apos;t know... but this would be a good starting point. If they were on duty, an automated phone call would be initiated to their registered phone number (the Met are using phones for their solution, so officers must all have one?).</p><p>The officer would then answer their phone and be asked for an auth code. Having a static auth code would be a bad idea - anyone can be socially engineered, and private details can be stolen along with a mobile phone in order to impersonate a police officer. Instead, why not make use of TOTP auth codes from authenticator apps - the police must already be using some form of this for accessing internal systems.</p><p>So the police officer has an active call from the automated system and puts in the auth code from their TOTP app, and a message is played back to the person who made the original call: the officer is indeed an active, on-duty police officer. If the wrong code was entered, the call would again be transferred to the police 999 centre.</p><h2 id="what-if-the-officer-is-off-duty">What if the officer is off duty?</h2><p>This is the scenario Sarah Everard found herself in. The officer involved was off duty - although she didn&apos;t know that. What would happen here?</p><p>If the system came back and said the officer was off duty, the call would again immediately be transferred to the police 999 centre to be dealt with. 
If a call disconnected from this line before reaching a successful auth, this too would get reported to the police, and of course the whole call would be recorded to be used later if necessary.</p><hr><p>The process is a simple one and ultimately could be implemented extremely quickly - the setup required would be making a TOTP code available on officers&apos; phones and linking every officer to a phone number. And what would happen if the officer had zero phone signal? Well, then the Met&apos;s idea wouldn&apos;t work either.</p><p>Ultimately, relying on video calls to an unknown entity using goodness knows what service is a terrible, terrible idea.</p><p>Trusted 2FA is a mixture of:</p><ul><li>Something you know -- such as a password or PIN</li><li>Something you have -- such as a phone, token or other digital device</li><li>Something you are -- something unique to your physical being -- biometrics -- like a fingerprint, palm print, retina scan, or your GPS location (to verify you are logging in from the correct area)</li></ul><p>We probably can&apos;t do the last one... but the mixture of the warrant card number and an auth code from either an app or a hardware token seems like a sensible route to take. Using Open Source VoIP projects such as Asterisk or Drachtio would make implementing this something that could be done very quickly - I imagine the &quot;is the officer on duty&quot; check would actually take longer to implement.</p><p>Side note: I thought calls to 999 automatically shared your GPS data with operators, but it seems that&apos;s only the case on Android and not on iOS, so GPS data couldn&apos;t be used to send a response team in those failure scenarios. Maybe part of the initial call setup would ask for a street name etc. Then again, if the officer involved was not acting lawfully, would they even allow the person to get this far into the authentication process anyway? Unlikely? 
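</p><p>To make the auth-code step concrete, here&apos;s a minimal sketch of RFC 6238 TOTP generation and verification using only the Python standard library. This is purely illustrative - the six-digit codes, 30-second step and base32 secret are assumptions, and a real deployment would use a vetted library with per-officer provisioned secrets:</p>

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the number of time steps since the epoch."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = (int.from_bytes(digest[offset:offset + 4], "big") & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def verify(secret_b32, submitted, at=None, step=30, window=1):
    """Accept codes from adjacent time steps to tolerate clock drift."""
    now = time.time() if at is None else at
    return any(hmac.compare_digest(totp(secret_b32, now + drift * step), submitted)
               for drift in range(-window, window + 1))

# RFC 6238 test vector: ASCII secret "12345678901234567890", T=59 gives 94287082 (8 digits)
SECRET = base64.b32encode(b"12345678901234567890").decode()
print(totp(SECRET, at=59))              # last 6 digits of the vector -> 287082
print(verify(SECRET, "287082", at=59))  # True
```

<p>A production system would also rate-limit attempts and log every verification for audit. 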
</p><h2 id="is-something-better-than-nothing">Is something better than nothing?</h2><p>I get that the police are trying to build back confidence and so need to present a solution. But is something better than nothing in this case? I&apos;m really not sure. There is just too little control given to the person being stopped by the police, and all of the control given to the officer involved - no change, unfortunately, from the scenario Sarah Everard found herself in.</p>]]></content:encoded></item><item><title><![CDATA[Announcing Broadcaster.VC]]></title><description><![CDATA[<p>Running <a href="https://2020.commcon.xyz/?ref=dan-jenkins.co.uk">CommCon Virtual</a> back in July was a real eye-opener for me. I&apos;d always been interested in the technical side of running events but I went to university, got into coding and then started <a href="https://nimblea.pe/?ref=dan-jenkins.co.uk">Nimble Ape</a> - an RTC consultancy specialising in Open Source WebRTC/VoIP and</p>]]></description><link>https://dan-jenkins.co.uk/announcing-broadcaster-vc/</link><guid isPermaLink="false">6676e0a3bcdd230156168c4e</guid><category><![CDATA[broadcasting]]></category><category><![CDATA[webrtc]]></category><category><![CDATA[janus]]></category><dc:creator><![CDATA[danjenkins]]></dc:creator><pubDate>Thu, 26 Nov 2020 16:07:57 GMT</pubDate><media:content url="https://dan-jenkins.co.uk/content/images/2020/11/broadcaster-logo-R2-1.svg" medium="image"/><content:encoded><![CDATA[<img src="https://dan-jenkins.co.uk/content/images/2020/11/broadcaster-logo-R2-1.svg" alt="Announcing Broadcaster.VC"><p>Running <a href="https://2020.commcon.xyz/?ref=dan-jenkins.co.uk">CommCon Virtual</a> back in July was a real eye-opener for me. 
I&apos;d always been interested in the technical side of running events but I went to university, got into coding and then started <a href="https://nimblea.pe/?ref=dan-jenkins.co.uk">Nimble Ape</a> - an RTC consultancy specialising in Open Source WebRTC/VoIP and Web projects. But CommCon brought it all back - suddenly I was in the thick of &quot;how to do remote participation at an event&quot;.</p><p>The way we did all the pre-production recordings for CommCon Virtual was with vMix using their WebRTC addition; well, I say we... the AV company <a href="https://pspav.com/?ref=dan-jenkins.co.uk">PSP</a> did all the hard work. But even though we were using WebRTC to record the sessions for an RTC conference, something still tasted awful about the experience - how vMix does all the mixing, acting as an MCU, which inherently means there&apos;s a delay in natural conversation between multiple people.</p><p>As the pandemic continued, we saw more and more conferences move online, along with national broadcasters seemingly happy to publicise a publicly traded company on air. All this talk of Zoom on national TV just didn&apos;t sit right with me, and so I started thinking about what we in the RTC community could do about it all. Back in July my friends at Meetecho wrote about <a href="https://www.meetecho.com/blog/webrtc-ndi/?ref=dan-jenkins.co.uk">their NDI plugin for Janus</a> and we got to work on improving its capabilities as well as thinking about how to enable people to use it. Ultimately a demo became more of a platform and <a href="https://broadcaster.vc/?ref=dan-jenkins.co.uk">Broadcaster.vc</a> was born.</p><figure class="kg-card kg-image-card"><img src="https://dan-jenkins.co.uk/content/images/2020/11/broadcaster-logo-R2.svg" class="kg-image" alt="Announcing Broadcaster.VC" loading="lazy"></figure><hr><p>So what is Broadcaster.vc? Ultimately it&apos;s a conferencing platform with Janus at its heart and its toes. 
This means you subscribe to Broadcaster.vc and use it as a meeting space for the participants involved in the publication of media - they get to talk to one another (or not, if you choose that route) with millisecond latency, but with NDI feeds available for every single participant in a &quot;room&quot;. We do that by running a Docker image inside your network that magically connects to all of your &quot;rooms&quot; in the cloud and produces an NDI feed for every participant - no mixed audio like you get from Microsoft Teams or Skype, and video at the resolution the sender is sending it at; meaning one of your publishers could be sending 480p and another could be sending 1080p. But that&apos;s not all - you&apos;re not limited to one camera and one mic; you can screen-share and add in more cameras and more mics as you see fit - they&apos;ll all just turn up as NDI feeds in whatever platform you&apos;re using.</p><figure class="kg-card kg-image-card"><img src="https://dan-jenkins.co.uk/content/images/2020/11/Screen-Shot-2020-11-26-at-16.54.07.jpg" class="kg-image" alt="Announcing Broadcaster.VC" loading="lazy"></figure><figure class="kg-card kg-image-card"><img src="https://dan-jenkins.co.uk/content/images/2020/11/Screen-Shot-2020-11-26-at-16.54.50.jpg" class="kg-image" alt="Announcing Broadcaster.VC" loading="lazy"></figure><p>Now there&apos;s still work to do, but the core of the platform is built, and I wanted to see what kind of response and interest it gathers - so here it is. Interested in knowing more? 
Send me an email at <a href="mailto:dan@nimblea.pe">dan@nimblea.pe</a></p><p>There&apos;s so much more to be said about the possibilities, it&apos;s ridiculous. You can force your room to use the VP9 codec for better picture quality, use stereo channels to separate production audio from backstage audio, and selectively send specific published feeds to specific other users - imagine creating a special output mix in OBS/vMix and having it sent back to one of your 5 publishers. The sky&apos;s the limit in what we can achieve with a platform designed entirely around broadcasting media, rather than trying to make a conferencing service like Zoom/Teams/Skype/whatever else fit the flows needed for running events.</p><p>We&apos;re excited about the possibilities. Let us know what you think! We&apos;ll hopefully open up registrations in the next week or so with some kind of subscription plan; but let us know if you&apos;re interested in what the platform could do for you - get in touch if you&apos;d like a demo.</p><p>UPDATE - The self-service demo/beta of the service is now available at <a href="https://broadcaster.vc/?ref=dan-jenkins.co.uk">https://broadcaster.vc</a> with all feedback welcomed at <a href="mailto:broadcaster@nimblea.pe">broadcaster@nimblea.pe</a></p>]]></content:encoded></item><item><title><![CDATA[Remote Working: Conferencing Services]]></title><description><![CDATA[<p>Today&apos;s post was going to be about SIP over TLS and Secure media but I decided this was more important to release as soon as I could. 
With more and more countries locking down their borders and encouraging their citizens not to meet in person we&apos;re</p>]]></description><link>https://dan-jenkins.co.uk/remote-working-conferencing-services/</link><guid isPermaLink="false">6676e0a3bcdd230156168c4d</guid><dc:creator><![CDATA[danjenkins]]></dc:creator><pubDate>Tue, 17 Mar 2020 09:13:14 GMT</pubDate><content:encoded><![CDATA[<p>Today&apos;s post was going to be about SIP over TLS and Secure media but I decided this was more important to release as soon as I could. With more and more countries locking down their borders and encouraging their citizens not to meet in person we&apos;re seeing a huge rise in popularity of conferencing services but what options do you have to stay connected with friends, colleagues, teachers and family?</p><p>WebRTC apps drive communication around the world and now more than ever before, we need freely available communications services. So here are some of the ones available to you to go and use without any setup, as well as some others you could go and setup yourselves for truly secure communication.</p><h2 id="hosted">Hosted</h2><figure class="kg-card kg-bookmark-card kg-card-hascaption"><a class="kg-bookmark-container" href="https://meet.jit.si/?ref=dan-jenkins.co.uk"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Jitsi Meet</div><div class="kg-bookmark-description">Join a WebRTC video conference powered by the Jitsi Videobridge</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://meet.jit.si/images/apple-touch-icon.png" alt></div></div><div class="kg-bookmark-thumbnail"><img src="https://meet.jit.si/images/jitsilogo.png?v=1" alt></div></a><figcaption>Jitsi Video Bridge</figcaption></figure><figure class="kg-card kg-bookmark-card kg-card-hascaption"><a class="kg-bookmark-container" href="https://whereby.com/user?ref=dan-jenkins.co.uk"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Whereby</div><div 
class="kg-bookmark-description">Easy video meetings with no login or downloads. Video conferencing with screen sharing, recording and much more.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://d32wid4gq0d4kh.cloudfront.net/favicon_whereby-196x196.png" alt><span class="kg-bookmark-author">whereby.com</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://d2qulvgqu65efe.cloudfront.net/facebook-opengraph.png" alt></div></a><figcaption>Whereby - the new name for appear.in</figcaption></figure><figure class="kg-card kg-bookmark-card kg-card-hascaption"><a class="kg-bookmark-container" href="http://sip2sip.info/?ref=dan-jenkins.co.uk"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Free Internet communications</div><div class="kg-bookmark-description">SIP2SIP is a real time communications service for audio, video, presence, chat, file transfers and multi-party conferencing.</div><div class="kg-bookmark-metadata"><span class="kg-bookmark-author">SIP2SIP</span><span class="kg-bookmark-publisher">AG Projects</span></div></div><div class="kg-bookmark-thumbnail"><img src="http://sip2sip.info/images/opengraph/logo.png" alt></div></a><figcaption>Sylk</figcaption></figure><p>Of course there are others that require signing up, a subscription or even downloading an app: <a href="https://houseparty.com/?ref=dan-jenkins.co.uk">Houseparty</a>, Skype, Google Meet, Zoom, <a href="https://matrix.org/?ref=dan-jenkins.co.uk">Matrix</a>, Facetime and Google Duo are just a few of the available services.</p><p>Edit: Whereby requires one of the users to have an account - I&apos;ve had my account for so long I hadn&apos;t realised.</p><p>However, if you can, why not remove the burden on the above services and run something yourself? 
The 8x8/Jitsi team have increased their available instances dramatically just to help with availability - I can&apos;t even comprehend how much effort it&apos;s taking to keep all of that running at the scale they are.</p><figure class="kg-card kg-embed-card"><blockquote class="twitter-tweet"><p lang="en" dir="ltr">Well, that didn&#x2019;t cut it, so running with 50x the resources now. 500+ Jitsi Videobridges at your service Italy! All systems green! &#x1F680;&#x1F680;&#x1F680; <a href="https://t.co/zQ5TodoQ8E?ref=dan-jenkins.co.uk">https://t.co/zQ5TodoQ8E</a></p>&#x2014; Jitsi (@jitsinews) <a href="https://twitter.com/jitsinews/status/1238034751136686080?ref_src=twsrc%5Etfw&amp;ref=dan-jenkins.co.uk">March 12, 2020</a></blockquote>
<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
</figure><p>If you really want to use Jitsi, there are also community run instances all over the globe! You can find them on the <a href="https://github.com/jitsi/jitsi-meet/wiki/Jitsi-Meet-Instances?ref=dan-jenkins.co.uk">Jitsi Meet Wiki</a></p><h2 id="self-hosted">Self Hosted</h2><p>So if you have the knowhow and can run something yourself, which projects could you go and run?</p><figure class="kg-card kg-bookmark-card kg-card-hascaption"><a class="kg-bookmark-container" href="https://jitsi.org/jitsi-videobridge/?ref=dan-jenkins.co.uk"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Jitsi Videobridge - scalable open source video conferencing for developers</div><div class="kg-bookmark-description">With Jitsi Videobridge, you can build massively scalable multiparty video applications. Start building your free open-source videoconferencing today.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://335wvf48o1332cksy23mw1pj-wpengine.netdna-ssl.com/wp-content/uploads/2017/01/cropped-jitsi-512x512-192x192.png" alt><span class="kg-bookmark-author">Jitsi</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://jitsi.org/wp-content/uploads/2017/06/jitsi-front.png" alt></div></a><figcaption>Jitsi Video Bridge</figcaption></figure><figure class="kg-card kg-bookmark-card kg-card-hascaption"><a class="kg-bookmark-container" href="https://github.com/nimbleape/dana-the-stream-gatekeeper?ref=dan-jenkins.co.uk"><div class="kg-bookmark-content"><div class="kg-bookmark-title">nimbleape/dana-the-stream-gatekeeper</div><div class="kg-bookmark-description">React based front-end demo for Asterisk&#x2019;s SFU. 
Contribute to nimbleape/dana-the-stream-gatekeeper development by creating an account on GitHub.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://github.githubassets.com/favicon.ico" alt><span class="kg-bookmark-author">GitHub</span><span class="kg-bookmark-publisher">nimbleape</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://avatars3.githubusercontent.com/u/7715457?s=400&amp;v=4" alt></div></a><figcaption>The Dana project from Nimble Ape teamed up with Asterisk</figcaption></figure><figure class="kg-card kg-bookmark-card kg-card-hascaption"><a class="kg-bookmark-container" href="https://github.com/versatica/mediasoup-demo?ref=dan-jenkins.co.uk"><div class="kg-bookmark-content"><div class="kg-bookmark-title">versatica/mediasoup-demo</div><div class="kg-bookmark-description">mediasoup official demo application. Contribute to versatica/mediasoup-demo development by creating an account on GitHub.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://github.githubassets.com/favicon.ico" alt><span class="kg-bookmark-author">GitHub</span><span class="kg-bookmark-publisher">versatica</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://avatars0.githubusercontent.com/u/1941045?s=400&amp;v=4" alt></div></a><figcaption>MediaSoup</figcaption></figure><p>If you&apos;ve got a project which is fairly easy to get set up and it&apos;s not in the list - send me a tweet at <a href="https://twitter.com/dan_jenkins?ref=dan-jenkins.co.uk">@dan_jenkins</a> and I&apos;ll get it added. If you read this and you&apos;re not sure how to get started - feel free to tweet me, I&apos;ll do all I can to help push you in the right direction!</p><p>Be kind to one another, go and connect with friends, family and co-workers. 
Keep the economies moving but remain safe while doing so.</p>]]></content:encoded></item><item><title><![CDATA[Remote Working: Get rid of your VPN and use an SBC]]></title><description><![CDATA[<p>This is the first of a series of blog posts this week about remote working - come back each day for something new!</p><p>Last week Chris from Crosstalk Solutions recorded a video titled &quot;<a href="https://www.youtube.com/watch?v=MoSlRO2E8q4&amp;ref=dan-jenkins.co.uk">Telecommunications 101</a>&quot; on his YouTube channel. If you haven&apos;t watched it already, it&</p>]]></description><link>https://dan-jenkins.co.uk/remote-working-get-rid-of-your-vpn-and-use-an-sbc/</link><guid isPermaLink="false">6676e0a3bcdd230156168c4b</guid><category><![CDATA[sbc]]></category><category><![CDATA[open source]]></category><category><![CDATA[kamailio]]></category><category><![CDATA[opensips]]></category><category><![CDATA[rtpengine]]></category><category><![CDATA[rtpproxy]]></category><category><![CDATA[telcobridges]]></category><dc:creator><![CDATA[danjenkins]]></dc:creator><pubDate>Mon, 16 Mar 2020 10:37:14 GMT</pubDate><media:content url="https://dan-jenkins.co.uk/content/images/2020/03/castle-chain-padlock-security-protect-fence.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://dan-jenkins.co.uk/content/images/2020/03/castle-chain-padlock-security-protect-fence.jpg" alt="Remote Working: Get rid of your VPN and use an SBC"><p>This is the first of a series of blog posts this week about remote working - come back each day for something new!</p><p>Last week Chris from Crosstalk Solutions recorded a video titled &quot;<a href="https://www.youtube.com/watch?v=MoSlRO2E8q4&amp;ref=dan-jenkins.co.uk">Telecommunications 101</a>&quot; on his YouTube channel. 
If you haven&apos;t watched it already, it&apos;s a good watch for those coming to remote working in light of COVID-19 - primarily targeted at IT admins who will currently be inundated with requests to enable access to internal systems for remote workers. His YouTube channel is full of fantastic content so it&apos;s worth subscribing.</p><p>The main takeaway from the video for me was about enabling VPNs for VoIP calling and how some phones even support VPN built into them to enable security as well as access to PBXs that live behind firewalls. Now, while I don&apos;t disagree that VPNs have a place and that not everything can be available publicly on the internet in all the different industries these technologies are used in; I do think it&apos;s high time that businesses started to look at whether phone systems should be openly available to all those who need them in today&apos;s world of remote working wherever you are. And it doesn&apos;t have to cost the earth either.</p><hr><h2 id="open-source-vs-commercial">Open Source vs Commercial</h2><p>The simplest solution here is what we call an SBC - a <a href="https://en.wikipedia.org/wiki/Session_border_controller?ref=dan-jenkins.co.uk">Session Border Controller</a>; we can either make our own SBC using Open Source software or buy one in from a commercial entity. What is an SBC? In short, it&apos;s an entry point for SIP traffic onto your network - whether that&apos;s SIP over UDP, TCP or TLS, plus the associated media - and it ultimately decides whether traffic is allowed into your internal network - a firewall of sorts for SIP. Now, I don&apos;t have much experience with commercial SBCs other than knowing they exist from the major players as well as some players you might not have heard of, like <a href="https://www.telcobridges.com/?ref=dan-jenkins.co.uk">Telcobridges</a> with their <a href="https://freesbc.telcobridges.com/?ref=dan-jenkins.co.uk">ProSBC</a> product. 
Do your research, or if you don&apos;t have time to do your own research, I can put you in touch with the right experts who can take care of everything for you.</p><h2 id="open-source-is-king">Open Source is King</h2><p>Instead of talking commercial solutions, I wanted to talk about using the Open Source tools available to you today which allow you to open up your business PBX to the outside world; primarily <a href="https://www.kamailio.org/w/?ref=dan-jenkins.co.uk">Kamailio</a> and <a href="https://github.com/sipwise/rtpengine?ref=dan-jenkins.co.uk">RTPEngine</a>. For a long time, VoIP phone systems have been kept off the public internet through fear of the unknown: there are bad people who want to abuse your phone system and hammer it looking for vulnerabilities; you hear all about them at VoIP conferences, with stories of thousands of dollars&apos; worth of fraud, and so you decided to take the easy route of keeping the phone system behind a firewall.</p><p>I say it&apos;s high time businesses stopped being fearful and got on with enabling employees to work freely from wherever they are in the world. There are loads of steps you can take to do that. The simplest might be opening up your PBX to the internet and blocking traffic using tools like <a href="https://apiban.org/?ref=dan-jenkins.co.uk">APIBAN.org </a>(an excellent tool from my friends at LOD), but going from a system with no exposure to the public internet to one that now has to deal with it is harder than you think.</p><p>In comes the SBC. Some may say Kamailio isn&apos;t an SBC - but I believe that when teamed up with RTPEngine it is a very capable and flexible one. An SBC is a really good answer to this particular problem for many reasons - you&apos;re still protecting your PBX behind a firewall and you&apos;re not changing how it interacts with your existing phones etc. 
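</p><p>To give a flavour of what that looks like, here&apos;s a heavily simplified, illustrative Kamailio configuration fragment - not a complete or production-ready config - showing the general shape of fronting an internal PBX with RTPEngine handling the media (the socket address and PBX address are made-up placeholders):</p>

```
# kamailio.cfg fragment (illustrative only - NAT handling, TLS,
# authentication and most module loading are omitted)
loadmodule "tm.so"
loadmodule "rtpengine.so"
modparam("rtpengine", "rtpengine_sock", "udp:127.0.0.1:2223")  # placeholder socket

request_route {
    if (is_method("INVITE")) {
        # rewrite the SDP so media flows via RTPEngine, not straight to the PBX
        rtpengine_manage();
    }
    # relay signalling to the internal PBX (placeholder address)
    $du = "sip:10.0.0.10:5060";
    t_relay();
}
```

<p>In practice you&apos;d start from one of the published Kamailio example configurations and have someone experienced harden it. 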
You&apos;re just enabling those users that need remote access to get it without altering your current system. Unfortunately there&apos;s no plug-and-play Kamailio and RTPEngine system available out there on the internet for you to download and start using within an hour. It&apos;s a super flexible solution, but one that takes some time and probably someone who truly understands how to make it work - people such as <a href="https://www.fredposner.com/?ref=dan-jenkins.co.uk">Fred</a>, <a href="https://www.kamailio.org/w/daniel-constantin-mierla/?ref=dan-jenkins.co.uk">Daniel</a> and <a href="https://skalatan.de/en/about?ref=dan-jenkins.co.uk">Henning</a> from the Kamailio project are your best port of call for a secure, flexible SBC.</p><p>If your particular choice of OpenSER fork is OpenSIPS then you can do the same with <a href="https://www.opensips.org/?ref=dan-jenkins.co.uk">OpenSIPS</a> and <a href="https://www.rtpproxy.org/?ref=dan-jenkins.co.uk">RTPProxy</a></p><h2 id="covid-19">COVID-19</h2><p>&quot;But we need a solution today, Dan!&quot; I hear you cry. Then maybe your best route would be Telcobridges&apos; FreeSBC/ProSBC system - I have huge faith in the Telcobridges technical team. ProSBC is $1/session/server/year and includes media rather than just SIP, which I would say is a must. If you don&apos;t want your PBX open to the internet, keep it hidden and let something else take the battering of intrusion attempts. Of course you could firewall off large parts of the world to reduce the battering of inbound traffic, but the one time one of your employees is in Russia for a legitimate reason and can&apos;t make/receive phone calls, you&apos;ll have long forgotten about that firewall rule in place on the SBC. We live in an age of remote working - allow your users/employees to indeed be remote.</p><p>Now, another reason for using a VPN would be &quot;my media and signaling is now encrypted by the VPN tunnel&quot;, which is correct but in my opinion an unnecessary overhead. 
Tomorrow I&apos;ll talk about encrypting your SIP signaling as well as your media so that no-one can listen in on those important conversations.</p><hr><p>Nimble Ape can help you with your needs in this space; and if we can&apos;t, we&apos;ll point you in the direction of a trusted consultant who can. Thanks to Chris from <a href="https://crosstalksolutions.com/?ref=dan-jenkins.co.uk">Crosstalk Solutions</a> for all he does on his YouTube channel.</p>]]></content:encoded></item><item><title><![CDATA[Github Sponsorships]]></title><description><![CDATA[<p>I finally got word this morning that the Nimble Ape GitHub org can accept GitHub sponsorships - which is excellent news. Don&apos;t know about GitHub sponsors? Have a read over at <a href="https://github.com/sponsors?ref=dan-jenkins.co.uk">https://github.com/sponsors</a>.</p><p>GitHub has enabled that &quot;Sponsor&quot; button on repos for a while</p>]]></description><link>https://dan-jenkins.co.uk/github-sponsorships/</link><guid isPermaLink="false">6676e0a3bcdd230156168c4a</guid><category><![CDATA[github]]></category><dc:creator><![CDATA[danjenkins]]></dc:creator><pubDate>Fri, 13 Mar 2020 13:45:28 GMT</pubDate><media:content url="https://dan-jenkins.co.uk/content/images/2020/03/mona-heart-featured.png" medium="image"/><content:encoded><![CDATA[<img src="https://dan-jenkins.co.uk/content/images/2020/03/mona-heart-featured.png" alt="Github Sponsorships"><p>I finally got word this morning that the Nimble Ape GitHub org can accept GitHub sponsorships - which is excellent news. Don&apos;t know about GitHub sponsors? 
Have a read over at <a href="https://github.com/sponsors?ref=dan-jenkins.co.uk">https://github.com/sponsors</a>.</p><p>GitHub has enabled that &quot;Sponsor&quot; button on repos for a while now, essentially linking out to third parties such as Patreon and Tidelift - that&apos;s all well and good, but nothing&apos;s easier than not needing to sign up for another account or explain a non-GitHub purchase to your accounting department.</p><p>With GitHub sponsors you can set tiers of sponsorship which are collected monthly - just like other platforms. You can see Nimble Ape&apos;s tiers at <a href="https://github.com/sponsors/nimbleape?ref=dan-jenkins.co.uk">https://github.com/sponsors/nimbleape</a></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://dan-jenkins.co.uk/content/images/2020/03/Screen-Shot-2020-03-13-at-13.33.58.png" class="kg-image" alt="Github Sponsorships" loading="lazy"><figcaption>Nimble Ape&apos;s GitHub sponsorship tiers</figcaption></figure><hr><p>Now, do I truly think people are going to subscribe to give Nimble Ape $5 a month to say thanks for some code we open-sourced? Probably not. Do I think it&apos;s possible that some of the companies that use code we open-sourced would pay us $100 a month for continual improvements to the code they use? Absolutely - $100 is nothing for a company - especially if they&apos;re already paying for GitHub licenses etc.</p><p>If the option&apos;s not there, people will never just give you money - make it easy for people to &quot;reward&quot; you for your Open Source code and they might just do that! Fancy supporting Nimble Ape with some cash every month? 
Go hit one of those buttons at <a href="https://github.com/sponsors/nimbleape?ref=dan-jenkins.co.uk">https://github.com/sponsors/nimbleape</a></p>]]></content:encoded></item><item><title><![CDATA[Powering my House &amp; EV]]></title><description><![CDATA[<p>I thought it was about time I wrote something non-work-related on here. I often get asked by colleagues/friends about how I power my home. They know I have a Tesla Model S, they know I have solar panels and they know I have &quot;home batteries&</p>]]></description><link>https://dan-jenkins.co.uk/powering-my-house-ev/</link><guid isPermaLink="false">6676e0a3bcdd230156168c49</guid><category><![CDATA[powerwall]]></category><category><![CDATA[solar]]></category><category><![CDATA[tesla]]></category><category><![CDATA[ev]]></category><dc:creator><![CDATA[danjenkins]]></dc:creator><pubDate>Wed, 15 Jan 2020 14:35:14 GMT</pubDate><media:content url="https://dan-jenkins.co.uk/content/images/2020/01/IMG_20190626_103938.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://dan-jenkins.co.uk/content/images/2020/01/IMG_20190626_103938.jpg" alt="Powering my House &amp; EV"><p>I thought it was about time I wrote something non-work-related on here. I often get asked by colleagues/friends about how I power my home. They know I have a Tesla Model S, they know I have solar panels and they know I have &quot;home batteries&quot; because I like to draw as little energy from the National Grid as possible. I almost gave a lightning talk at <a href="https://2019.commcon.xyz/?ref=dan-jenkins.co.uk">CommCon 2019</a> about this topic, but time was running short as it was, so here&apos;s what would have gone into the lightning talk. 
I guess whether you find it interesting or not depends on whether you&apos;re interested in reducing your carbon footprint going forward.</p><h3 id="a-quick-tldr-">A Quick TLDR;</h3><p>Ultimately the energy we use from the National Grid is greener, the energy we use in the summer is mostly self-generated and I love driving an EV.</p><h2 id="the-goal">The Goal</h2><p>The goal isn&apos;t really to draw as little from the grid as possible overall; it&apos;s, most of all, to draw as little from the grid at peak times as possible.</p><h2 id="a-bit-of-background">A bit of background</h2><p>It all started when we moved into our current house; we had some money left over from the sale/purchase and wanted to reduce our electricity bills; I work from home and have computers running 24/7, many smart devices etc - all drawing power constantly - we have a base load of around 1kW. It wasn&apos;t really anything to do with &quot;going green&quot; at all - we had some money, spent it on reducing our costs every month and in the process got a cheque for our solar generation/export every 4 months. Nice, right?</p><p>Fast forward a few years and I got the Tesla Model S, again 100% driven by the fact I wanted a company car (not a personal one) and the only way to do that without spending a lot on personal tax to HMRC was to go electric; the only car that matched long distance and that extra bit of luxury was a Tesla. But this is where I got into trying to shift usage of the National Grid to times when it&apos;s not under high demand and electricity is therefore cheaper.</p><h2 id="what-gadgets-do-i-have">What gadgets do I have?</h2><!--kg-card-begin: markdown--><ul>
<li><a href="https://www.tesla.com/en_gb/models?ref=dan-jenkins.co.uk">Tesla Model S - 100D</a></li>
<li><a href="https://www.tesla.com/en_GB/energy?ref=dan-jenkins.co.uk">2x Tesla Powerwalls &amp; Tesla Gateway</a></li>
<li>8kW of solar panels (pointed in two different directions)</li>
<li><a href="https://myenergi.com/?ref=dan-jenkins.co.uk">MyEnergi Zappi EVSE (charger for the car), MyEnergi Hub</a></li>
</ul>
<!--kg-card-end: markdown--><p>Tesla are by far out in front when it comes to electric cars and their home energy products - no-one comes close to the Powerwall&apos;s energy density for the price. Inside each Powerwall you get 13.5kWh of storage - we have two so that&apos;s 27kWh of storage, which works out well when you have around a 1kW base load in the house with occasional heaters going on and off in my outdoor office.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://dan-jenkins.co.uk/content/images/2020/01/IMG_20190625_182407.jpg" class="kg-image" alt="Powering my House &amp; EV" loading="lazy"><figcaption>2x Powerwalls in a &quot;Stack&quot;</figcaption></figure><p>The MyEnergi Zappi and Hub allow us to charge the car using excess Solar during the day. Now that we have two Powerwalls this happens infrequently, but when we only had the one (or back when we didn&apos;t have a Powerwall at all) it meant any excess solar during the day could be pumped into the car instead of going back to the grid - the Zappi essentially monitors the power going back to the grid and, once it reaches the right level for the car, starts sending that exact amount to the car. Most chargers can&apos;t deliver a variable amount of current like this. Imagine having the power to put &quot;free&quot; miles into your car just by the power of the sun - pretty damn cool.</p><figure class="kg-card kg-image-card"><img src="https://dan-jenkins.co.uk/content/images/2020/01/IMG_20180606_113938_1.jpg" class="kg-image" alt="Powering my House &amp; EV" loading="lazy"></figure><h2 id="the-how">The How</h2><p>This is better described by a picture. It&apos;s definitely not a simple setup, but I wouldn&apos;t call it a complex one either. 
</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://dan-jenkins.co.uk/content/images/2020/01/house.png" class="kg-image" alt="Powering my House &amp; EV" loading="lazy"><figcaption>OK, it looked better in my head.</figcaption></figure><p>Did you know electricity can flow in either direction within all of your circuits inside your home? That&apos;s how this all works. Powerwalls consume energy as well as push it out to keep up with what the house is trying to consume from the grid. The Grid can cope with us asking for energy as well as us pushing it back out to them for someone else to use. Solar panel inverters (they take the DC the panels generate and convert it into AC, which is what we use in our homes) only generate electricity, just as the Model S can&apos;t push energy back into the house because the inverters in the car (yes, the energy is stored as DC in the car too) can&apos;t chuck that energy back out as AC (well, maybe they can in something called Vehicle To Grid charging but that&apos;s a completely separate topic).</p><p>TLDR; We signed up to an energy supplier who could give us cheap electric at night to charge the Powerwall (and now the Powerwalls) as well as the car, and we run off that stored electric all day. The image below is a pretty standard day where the Powerwalls are almost empty and we need to top the car up.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://dan-jenkins.co.uk/content/images/2020/01/energy.png" class="kg-image" alt="Powering my House &amp; EV" loading="lazy"><figcaption>Energy drawn on the 6th January 2020</figcaption></figure><p>That&apos;s 55kWh of energy at 5p per kWh - resulting in &#xA3;2.75 to run the house and the car that day. This is in the height of winter with almost no Solar generation the day before (3.2kWh) to help top up the Powerwalls. 
Some may say that&apos;s still pretty expensive to run your house all day, but if we take a closer look, around 28kWh of that 55 would be energy for the car, 4 would be the normal base load for the house during that time, and around 23 would be going into the Powerwalls; that 23 + 4 = 27kWh of house load would have cost &#xA3;4.05 for the day (if we take around 15p per kWh as the average for my area) - not even taking the car into consideration.</p><p>In the summer you can pretty much say that my solar generation will top the Powerwalls up close to full every day and we&apos;ll never top them up from the grid (there&apos;s a mode called Self-powered in the Tesla Gateway), meaning our only cost for energy during the summer is from topping up the car.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://dan-jenkins.co.uk/content/images/2020/01/energy-2.png" class="kg-image" alt="Powering my House &amp; EV" loading="lazy"><figcaption>Energy drawn on the 14th August 2019</figcaption></figure><p>To be able to do all of this I rely on my energy provider - <a href="https://octopus.energy/?ref=dan-jenkins.co.uk">Octopus</a> - and an energy plan called Octopus Go - 4 hours of energy at 5 pence per kWh - if you&apos;re interested in using Octopus you can use <a href="https://share.octopus.energy/ebon-louse-122?ref=dan-jenkins.co.uk">my referral link</a> and get yourself a &#xA3;50 credit on your account. 
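</p><p>Going back to that winter day, here&apos;s the arithmetic as a quick sketch (figures are the rough ones from above; the 15p daytime rate is my rough local average, not an official tariff):</p>

```javascript
// Rough winter-day cost comparison (all figures from the post;
// the 0.15 daytime rate is a rough local average, not a real tariff)
const offPeakRate = 0.05; // £/kWh during Octopus Go's off-peak window
const daytimeRate = 0.15; // £/kWh rough standard daytime rate for my area

const carKwh = 28;       // energy into the car
const houseKwh = 4;      // base load during the off-peak window
const powerwallKwh = 23; // energy stored for use during the day

const totalKwh = carKwh + houseKwh + powerwallKwh;
const offPeakCost = totalKwh * offPeakRate;
const houseOnlyAtDaytimeRate = (houseKwh + powerwallKwh) * daytimeRate;

console.log(totalKwh);                          // 55
console.log(offPeakCost.toFixed(2));            // 2.75
console.log(houseOnlyAtDaytimeRate.toFixed(2)); // 4.05
```

<p>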
They&apos;re what you call a <a href="https://octopus.energy/blog/greenwashing/?ref=dan-jenkins.co.uk">very green provider</a>, which is nice to know too.</p><figure class="kg-card kg-image-card"><img src="https://dan-jenkins.co.uk/content/images/2020/01/flow.gif" class="kg-image" alt="Powering my House &amp; EV" loading="lazy"></figure><p>The long and short of it all is that while reducing my carbon footprint has become more important to me (we&apos;ll soon probably replace a Land Rover 4x4 with another EV), that&apos;s not really been the primary goal through all of this - saving money by using the right tech has been; it&apos;s just that a nice byproduct is becoming more green in our energy usage. Some of you may read this and say, sure, you&apos;re saving money now but you&apos;ve spent X on a Model S, and Y on 2x Powerwalls and Solar panels - and you&apos;re absolutely right; but ultimately that money in savings accounts really wasn&apos;t giving us much of a return, this reduces our monthly bills now, and in around 9-10 years&apos; time the Powerwalls and the solar panels will have paid for themselves - even quicker if you&apos;re in a different part of the world where you have sun all year round and electricity prices are much higher than we have in the UK. </p><p>If you&apos;re looking at purchasing a Model S, X or 3 you can use my <a href="https://ts.la/dan60390?ref=dan-jenkins.co.uk">Tesla referral link</a> for some free supercharging.</p><p>Let me know if this is at all interesting on <a href="https://twitter.com/dan_jenkins?ref=dan-jenkins.co.uk">twitter</a>. 
I could talk about EVs, battery tech and the ins and outs of charging curves all day, but it&apos;s probably not that interesting to anyone unless they&apos;ve also got a Powerwall or an EV.</p>]]></content:encoded></item><item><title><![CDATA[I'll be at Open Source World (ITExpo)]]></title><description><![CDATA[<p>I&apos;m back to work after the Christmas break and I just wanted to say I&apos;ll be speaking at <a href="https://www.opensourcesummit.com/east/?ref=dan-jenkins.co.uk">Open Source World</a> at ITExpo, 11-14 February, talking about building Voice Bots with Dialogflow and Open Source Software readily available today. While the rest of ITExpo has never</p>]]></description><link>https://dan-jenkins.co.uk/ill-be-at-open-source-summit/</link><guid isPermaLink="false">6676e0a3bcdd230156168c48</guid><dc:creator><![CDATA[danjenkins]]></dc:creator><pubDate>Mon, 06 Jan 2020 11:29:33 GMT</pubDate><media:content url="https://dan-jenkins.co.uk/content/images/2020/01/itexpo-logo-1.png" medium="image"/><content:encoded><![CDATA[<img src="https://dan-jenkins.co.uk/content/images/2020/01/itexpo-logo-1.png" alt="I&apos;ll be at Open Source World (ITExpo)"><p>I&apos;m back to work after the Christmas break and I just wanted to say I&apos;ll be speaking at <a href="https://www.opensourcesummit.com/east/?ref=dan-jenkins.co.uk">Open Source World</a> at ITExpo, 11-14 February, talking about building Voice Bots with Dialogflow and Open Source Software readily available today. While the rest of ITExpo has never really interested me, the Open Source parts always have, although you might find the expo halls filled with vendors and products interesting. 
ITExpo houses SD-WAN Expo, MSP Expo, The Blockchain Event and Channel Vision Expo.</p><figure class="kg-card kg-image-card"><img src="https://dan-jenkins.co.uk/content/images/2020/01/itexpo-logo.png" class="kg-image" alt="I&apos;ll be at Open Source World (ITExpo)" loading="lazy"></figure><p>Quite a few well-known members of the Open Source Community will be there - Fred Posner, Allison Smith, Alan Percy and David Duffett, to name only a few!</p><p>Let me know if you want to meet up and have a chat about how Nimble Ape can help your business.</p>]]></content:encoded></item><item><title><![CDATA[Giving thanks to OSS]]></title><description><![CDATA[<p>So 2019 is coming to a close and what a year it&apos;s been. This is the time of year to reflect on the previous 12 months; both the good and the bad. This year we&apos;re not just thinking about the past 12 months but the past</p>]]></description><link>https://dan-jenkins.co.uk/giving-thanks-to-oss/</link><guid isPermaLink="false">6676e0a3bcdd230156168c47</guid><dc:creator><![CDATA[danjenkins]]></dc:creator><pubDate>Fri, 20 Dec 2019 15:16:18 GMT</pubDate><media:content url="https://dan-jenkins.co.uk/content/images/2019/12/COMMCON_WEDNESDAY_20190710_MB370.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://dan-jenkins.co.uk/content/images/2019/12/COMMCON_WEDNESDAY_20190710_MB370.jpg" alt="Giving thanks to OSS"><p>So 2019 is coming to a close and what a year it&apos;s been. This is the time of year to reflect on the previous 12 months; both the good and the bad. This year we&apos;re not just thinking about the past 12 months but the past decade. It got me thinking about what I had to be grateful for when it comes to what drives my business forward every day. 
Nimble Ape will be going into its 6th year in January - and we&apos;ve done some pretty amazing things in those 5 years - I&apos;m incredibly proud of what we&apos;ve achieved - we&apos;ve got clients all over the world, work on amazing projects and have run 2 of the best RTC conferences in the world with CommCon (I may be biased). But what and who have enabled me to do all of it?</p><h2 id="open-source-software">Open Source Software</h2><p>Open Source Software runs the world and I believe firmly in its capabilities, as well as the pros of using software that has its code openly available on the internet. Something I&apos;ve not been great at doing other than in conference talks is calling out the projects which enable Nimble Ape to operate and create amazing solutions for our clients. So here are some of the projects we&apos;ve used heavily in the past year.</p><hr><h2 id="drachtio-rtpengine">Drachtio &amp; RTPEngine</h2><p>The <a href="https://drachtio.org/?ref=dan-jenkins.co.uk">Drachtio</a> project has been a huge part of Nimble Ape&apos;s offering for the past 2 years or so. If you don&apos;t know what Drachtio is, it&apos;s a combination of a C++ SIP Server and a Node.js framework (a Node.js module) that enables people like me who love writing JavaScript (and more specifically Node.js) to use the ecosystem we love to create applications that handle SIP traffic in the manner we&apos;re used to. Think of it like Kamailio with a Node.js connector and a set of APIs that are simple for me to understand. 
Kamailio &amp; OpenSIPS are fantastic projects but they&apos;re very much over my head for anything more than dispatching SIP messaging to upstream Media Servers such as Asterisk or FreeSWITCH.</p><p><a href="https://github.com/sipwise/rtpengine?ref=dan-jenkins.co.uk">RTPEngine</a> is essentially a Media Proxy - able to take media in, be it plain RTP or encrypted WebRTC Media, and proxy it somewhere else, transcoding, decrypting or encrypting it in the process. It also has an API which allows us to pass SDP into it and have RTPEngine munge it for us into what it needs to be for RTPEngine to accept the media.</p><p>In short, Drachtio &amp; RTPEngine are a core part of what forms the WebRTC B2BUAs we create for our clients at a fast rate of knots. I thank the SipWise team for RTPEngine and especially thank Dave Horton for Drachtio - it&apos;s completely changed how I build these kinds of applications.</p><h2 id="asterisk">Asterisk</h2><p>The <a href="https://www.asterisk.org/?ref=dan-jenkins.co.uk">Asterisk</a> project is very close to my heart, having spoken at every Astricon since I joined the community 7 or 8 years ago. The Asterisk project (and more specifically Digium) is what allowed me to start Nimble Ape in the first place. Asterisk was just a PBX to me, covered up by FreePBX which was in turn covered up by Elastix. Asterisk is now a toolbox with many capabilities at hand. Being able to build a media-based application with media mixing, prompts, bringing in external media sources, forking a channel&apos;s media out to an external application to handle transcription, as well as creating a bot using something like Dialogflow, is quite simply amazing. The Asterisk project has changed quite a bit over the past 6 years but it&apos;s all for the better. 
Thank you to the Asterisk team for continuing to improve a great toolkit.</p><h2 id="webrtc">WebRTC</h2><p><a href="https://webrtc.org/?ref=dan-jenkins.co.uk">WebRTC</a> has been a complete game changer for many reasons, but a standard that allows encrypted media with a codec that&apos;s able to handle adaptive bandwidths at high quality with packet loss within a browser (as well as natively) has been amazing. If you&apos;d told me 8 years or so ago - when I was playing about with WebRTC for the first time and failing massively, constantly asking for help on the Asterisk IRC channel - that WebRTC would be available on billions of devices around the world, with online gaming driven by servers in the cloud, I&apos;d have told you you were bonkers. It&apos;s been a power for change and drives more and more applications every single month. Is 2020 the year we&apos;ll see it become a &quot;proper standard&quot;? Who knows.</p><h2 id="react-react-native-the-rtc-react-native-ecosystem">React, React Native &amp; the RTC React Native Ecosystem</h2><p>The react framework (that isn&apos;t a framework) has completely changed how we write web apps over the past 2 and a half years. The open source components available to stitch together, which allow me to create amazing-looking (and acting) applications without any real design skills, are one of the best parts of the web - that&apos;s not just a react thing but to me, writing react applications is painless (or as painless as we can get - it&apos;s completely changed how I build applications with Web Tech). Whilst React is cool, <a href="https://facebook.github.io/react-native/?ref=dan-jenkins.co.uk">React Native</a> is the key game changer for me - being able to build amazing iOS and Android applications with Web Tech is hands down amazing. Yes, I may complain about React Native every once in a while but it&apos;s usually only for an hour or two. While React Native is amazing, it would be useless to me without the modules that empower the RTC apps we create. 
A special mention goes to the <a href="https://wazo.io/?ref=dan-jenkins.co.uk">Wazo team</a> and <a href="https://github.com/kylekurz?ref=dan-jenkins.co.uk">Kyle Kurz</a> for <a href="https://github.com/react-native-webrtc/react-native-callkeep?ref=dan-jenkins.co.uk">react-native-callkeep</a> and <a href="https://github.com/saghul?ref=dan-jenkins.co.uk">Sa&#xFA;l Ibarra Corretg&#xE9;</a> for leading <a href="https://github.com/react-native-webrtc/react-native-webrtc?ref=dan-jenkins.co.uk">react-native-webrtc</a> forward - both of the projects are vital to Nimble Ape.</p><h3 id="commcon-sponsors">CommCon Sponsors</h3><p>A quick mention to the businesses that have supported <a href="https://2019.commcon.xyz/?ref=dan-jenkins.co.uk">CommCon</a> for the past two years; without them the best event in RTC would never have happened. Throughout both years there have been 4 constants within those sponsorship spots - <a href="https://simwood.com/?ref=dan-jenkins.co.uk">Simwood</a>, <a href="https://www.teluu.com/?ref=dan-jenkins.co.uk">Teluu</a>, <a href="https://www.lod.com/?ref=dan-jenkins.co.uk">LOD</a> and <a href="https://cloudonix.io/?ref=dan-jenkins.co.uk">Cloudonix</a>. Simwood have been Platinum sponsors both years and it&apos;s fair to say without them I wouldn&apos;t have even considered doing the event in the first place. Teluu were Gold sponsors both years and on both occasions they were the first to sign contracts and send me money. LOD and Cloudonix were Community sponsors both years - giving money without very much in return - I thank all of them. Will CommCon be back in 2020? As a physical event in the UK, no; I need a break from organising it and I need to have a think about how to make the conference cover its own costs. Will it be back in another form? There may be plans afoot - I&apos;ll know more in the new year.</p><h2 id="the-oss-rtc-community">The OSS RTC Community</h2><p>And finally... 
The OSS RTC Community - these are the people I trust in the community and talk to almost every day. They&apos;re the ones I use as sounding boards when I&apos;m trying to figure out a solution to a problem or when I need inspiration. Open Source is nothing without the people around it. Yes, software may be &quot;free&quot; but without contributing back to the community you&apos;re just a taker. If you use any Open Source project, and you don&apos;t already contribute to the project in some way, maybe have a think over the holiday break (if you have one) about how you can contribute back in some way. That doesn&apos;t have to be code, it doesn&apos;t have to be getting up and speaking at a conference and it doesn&apos;t have to be financial either. It&apos;s remarkable how many ways there are for someone to give back to the projects that empower them; how can you help? What&apos;s your skill? I thank every member of the RTC community for helping drive all of the solutions forward. You all know who you are.</p><hr><p>Thank you for 2019 and I&apos;m looking forward to what&apos;s to come in 2020.</p>]]></content:encoded></item><item><title><![CDATA[Screen-Sharing with Asterisk's SFU]]></title><description><![CDATA[<p>Whilst building the <a href="https://github.com/nimbleape/dana-the-stream-gatekeeper?ref=dan-jenkins.co.uk">Dana</a> project I wanted to add in the ability to screen share - it&apos;s pretty much a norm in any WebRTC conferencing application nowadays. 
What isn&apos;t so much the norm is the ability to send both your webcam video stream and your screen-share;</p>]]></description><link>https://dan-jenkins.co.uk/screensharing-with-asterisks-sfu/</link><guid isPermaLink="false">6676e0a3bcdd230156168c46</guid><category><![CDATA[asterisk]]></category><category><![CDATA[sfu]]></category><category><![CDATA[screen sharing]]></category><category><![CDATA[webrtc]]></category><dc:creator><![CDATA[danjenkins]]></dc:creator><pubDate>Tue, 17 Dec 2019 15:45:15 GMT</pubDate><content:encoded><![CDATA[<p>Whilst building the <a href="https://github.com/nimbleape/dana-the-stream-gatekeeper?ref=dan-jenkins.co.uk">Dana</a> project I wanted to add in the ability to screen share - it&apos;s pretty much a norm in any WebRTC conferencing application nowadays. What isn&apos;t so much the norm is the ability to send both your webcam video stream and your screen-share; usually applications will swap out your webcam feed for the screen-share; I dislike this entirely. For me, I&apos;m still explaining myself, I&apos;m still connecting with the other people in the conference call, I&apos;m still using hand gestures etc. But I hit a snag - it didn&apos;t seem to work with Asterisk&apos;s SFU. Why?</p><h2 id="getdisplaymedia">getDisplayMedia</h2><p>Let&apos;s take a step back here; how do we go about giving plugin and extension free access to Screen Sharing in WebRTC applications? We use the <em>navigator.mediaDevices.getDisplayMedia</em> API in the browser with some audio and video constraints; just like we would with <em>getUserMedia</em>.</p><p>Getting the stream should be as simple as </p><pre><code class="language-javascript">stream = await navigator.mediaDevices.getDisplayMedia({
    video: {
        cursor: &apos;always&apos;
    },
    audio: false
});</code></pre><p>You can see I&apos;ve set some extra constraints in the video object so that the cursor is always captured during a screen-share. How many times have you gestured around something on screen while sharing your screen but the other users can&apos;t see what you&apos;re doing? Most WebRTC conferencing applications miss this vital addition out for some reason. You can also say yes to audio; in Chrome and Firefox (I&apos;m not sure about other browsers off the top of my head) this will allow you to share the audio from a Tab in the browser as an audio stream. The issue with Asterisk&apos;s SFU is that it mixes all the audio streams together, so you can&apos;t send audio up that you&apos;d then get back again - causing a painful experience for all involved.</p><p>I hear you saying &quot;but surely it wouldn&apos;t come back, just like your voice doesn&apos;t come back to you because Asterisk is careful about what it mixes to who&quot;? Ah yes, this would be the case if Asterisk could handle multiple audio and video tracks in the uploaded stream to Asterisk - it can&apos;t, and so to be able to do both webcam and screen-sharing we create an entirely new Peer Connection stating we don&apos;t want to receive any streams, and that we just want to send one. </p><p>So we create a new Peer Connection purely for streaming up a screen-share, and we have an existing Peer Connection with my webcam - woohoo, that&apos;s it and it&apos;s super simple? Unfortunately not. Asterisk has this need for audio, the very core of Asterisk wants audio, and so even though Asterisk accepts this screen-share video stream, it isn&apos;t able to forward it on as it has no audio. How do we get around this? We generate silence using Web Audio.</p><pre><code class="language-javascript">function _createSilence() {
    let ctx = new AudioContext(), oscillator = ctx.createOscillator();
    let dst = oscillator.connect(ctx.createMediaStreamDestination());
    oscillator.start();
    return Object.assign(dst.stream.getAudioTracks()[0], {enabled: false});
}

stream = await navigator.mediaDevices.getDisplayMedia({
    video: {
        cursor: &apos;always&apos;
    },
    audio: false
});

let silenceTrack = _createSilence();
stream.addTrack(silenceTrack);</code></pre><p>First we have this <em>_createSilence</em> function, which creates a new Audio Context and an oscillator from it; we then connect the oscillator to a new media stream destination and start it. We then return that stream&apos;s audio track with its <em>enabled</em> property set to false - a disabled track transmits silence, which is exactly what we want. We take that track and add it to the existing screen-share media stream. Magically, Asterisk now takes our video stream of the screen-share and forwards it on correctly. </p><p>We now have to deal with the fact we&apos;re receiving the screen-share that we&apos;re sending on the other Peer Connection, but that&apos;s a trivial job of not showing it because we know it&apos;s our screen-share.</p><p>If you&apos;re interested in following along there&apos;s an <a href="https://issues.asterisk.org/jira/browse/ASTERISK-28655?ref=dan-jenkins.co.uk">active issue</a> in Asterisk&apos;s issue tracker about having to send audio even though we don&apos;t really have any.</p><p><em>A big thanks to Lorenzo Miniero from the Meetecho team for helping me figure this one out!</em></p>]]></content:encoded></item><item><title><![CDATA[Dana & AudioServer - Transcription]]></title><description><![CDATA[<p>For years now we&apos;ve been asking for a new feature in Asterisk that enabled us to get a raw stream of audio out of Asterisk in a usable form that allowed us to integrate with speech to text engines, bot platforms etc and that became a possibility in</p>]]></description><link>https://dan-jenkins.co.uk/dana-audioserver-transcription/</link><guid isPermaLink="false">6676e0a3bcdd230156168c45</guid><category><![CDATA[asterisk]]></category><category><![CDATA[ari]]></category><category><![CDATA[google speech to text]]></category><dc:creator><![CDATA[danjenkins]]></dc:creator><pubDate>Mon, 16 Dec 2019 13:11:24 GMT</pubDate><media:content 
url="https://dan-jenkins.co.uk/content/images/2019/12/Screen-Shot-2019-12-12-at-14.08.13.png" medium="image"/><content:encoded><![CDATA[<img src="https://dan-jenkins.co.uk/content/images/2019/12/Screen-Shot-2019-12-12-at-14.08.13.png" alt="Dana &amp; AudioServer - Transcription"><p>For years now we&apos;ve been asking for a new feature in Asterisk that enabled us to get a raw stream of audio out of Asterisk in a usable form, allowing us to integrate with speech-to-text engines, bot platforms and the like, and that became a possibility in version 16.6 of Asterisk (<a href="https://blogs.asterisk.org/2019/10/09/external-media-a-new-way-to-get-media-in-and-out-of-asterisk/?ref=dan-jenkins.co.uk">https://blogs.asterisk.org/2019/10/09/external-media-a-new-way-to-get-media-in-and-out-of-asterisk/</a>).</p><p>Dana is built as a project to show developers how to go about building for Asterisk&apos;s SFU, but I wanted to show off some other new abilities in Asterisk - namely External Media and ARI applications without Dialplan. So that&apos;s what I&apos;ve done. 
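</p><p>For a flavour of what creating an External Media channel looks like over ARI&apos;s REST interface, here&apos;s a rough sketch - the host, credentials and app name are illustrative, not the real ones from these repos:</p>

```javascript
// Hypothetical sketch of the ARI request that creates an External Media
// channel (host, credentials and app name are made up for illustration)
function buildExternalMediaRequest(ariBase, auth, app, externalHost) {
  const params = new URLSearchParams({
    app,                         // the ARI/Stasis application that will own the channel
    external_host: externalHost, // host:port our audio server listens on for RTP
    format: 'slin16',            // 16kHz signed linear - easy to feed to a speech engine
  });
  return {
    url: `${ariBase}/channels/externalMedia?${params}`,
    options: {
      method: 'POST',
      headers: {
        Authorization: 'Basic ' + Buffer.from(auth).toString('base64'),
      },
    },
  };
}

const req = buildExternalMediaRequest(
  'http://asterisk.example.com:8088/ari', // illustrative ARI base URL
  'ari-user:ari-pass',                    // illustrative ARI credentials
  'dana-bridge',
  '203.0.113.10:9999'
);
// fetch(req.url, req.options) would then hand us back the new channel
```

<p>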
There are now two other projects on GitHub - a speech-to-text engine which takes audio from Asterisk and sends it to Google&apos;s Speech to Text engine, as well as a very simple ARI application to handle creating SFU conference bridges and spying on individual channels to get each individual participant&apos;s audio transcribed.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://dan-jenkins.co.uk/content/images/2019/12/Screen_Shot_2019-12-12_at_14_03_42.png" class="kg-image" alt="Dana &amp; AudioServer - Transcription" loading="lazy"><figcaption>Sa&#xFA;l Ibarra Corretg&#xE9;, Lorenzo Miniero &amp; Fred Posner joined me</figcaption></figure><p>By using MQTT to push transcriptions from the server down to the browser, we&apos;re able to get real-time, word-for-word transcriptions in Dana - it&apos;s pretty damn cool.</p><hr><p>For a limited time check out this version of Dana at <a href="https://dana-meet.nimbleape.xyz/?ref=dan-jenkins.co.uk">https://dana-meet.nimbleape.xyz/</a> - you&apos;ll need to go into the settings and set your name. I&apos;ll keep it running for a couple of weeks but of course there are costs involved in using the transcription service from Google, as well as the Vultr instance that&apos;s currently running Asterisk &amp; the two Node services (if you want to enable this demo to remain running, consider using my <a href="https://www.vultr.com/?ref=8347988-4F">referral link for Vultr</a>).</p><p>Want to go run it yourself? 
Check out the GitHub repos - <a href="https://github.com/nimbleape/dana-tsg-rtp-stt-audioserver?ref=dan-jenkins.co.uk">https://github.com/nimbleape/dana-tsg-rtp-stt-audioserver</a> and <a href="https://github.com/nimbleape/dana-tsg-ari-bridge?ref=dan-jenkins.co.uk">https://github.com/nimbleape/dana-tsg-ari-bridge</a> - and you&apos;ll need a special branch of Dana for the time being: <a href="https://github.com/nimbleape/dana-the-stream-gatekeeper/tree/transcription-wip?ref=dan-jenkins.co.uk">https://github.com/nimbleape/dana-the-stream-gatekeeper/tree/transcription-wip</a></p><p>All 3 of the services use MQTT to link them together (purely because MQTT has a nice WebSocket interface for the browser). Input your name into the settings and use any room name at all - we&apos;re using Asterisk&apos;s new ability to not have any Dialplan listed in extensions.conf; instead, against our WebRTC user, we&apos;ve set a context that&apos;s automatically created for ARI apps - so there&apos;s no dialplan extension matching string involved here, it just goes straight into our conference bridge ARI app - look at <a href="https://blogs.asterisk.org/2019/03/27/stasis-improvements-goodbye-dialplan/?ref=dan-jenkins.co.uk">https://blogs.asterisk.org/2019/03/27/stasis-improvements-goodbye-dialplan/</a> if you want more detail on this.</p><hr><p>The Audio Server is written in Node and takes data from a UDP socket, forms it into a Node.js stream and pipes it into a Google Speech to Text stream. Sounds simple, doesn&apos;t it? 
It was far from simple due to the nature of UDP, Asterisk servers outside of my network, needing to forward UDP packets over a UDP/TCP tunnel and so on; let alone everything that George Joseph had already figured out in his implementation - <a href="https://github.com/asterisk/asterisk-external-media?ref=dan-jenkins.co.uk">https://github.com/asterisk/asterisk-external-media</a>. The key difference between George&apos;s and mine is the ability to handle multiple streams, and that&apos;s done by getting the source port of the media out of Asterisk and sending it to the audio server, so we can say Participant X had a source port of 12345 and associate their media with Bridge Y and Participant X. I can&apos;t wait until the TCP transport (or better yet, WebSockets) is added to the External Media API - that should make developing solutions far easier (if you&apos;re OK with sending audio over TCP).</p><p>It&apos;s all been tested on Chrome only and there are still a few issues with video tracks - feel free to create issues on those GitHub repos on anything odd that you spot.</p>]]></content:encoded></item><item><title><![CDATA[Twitter, 2FA and WebAuthn]]></title><description><![CDATA[<p>TLDR; Go enable 2FA with either a security key or Authentication App on twitter today!</p><p>Twitter&apos;s had 2FA for a while - both via SMS and via an Authentication App such as Google&apos;s Authenticator or Authy. 
The difference to today is that you used to need</p>]]></description><link>https://dan-jenkins.co.uk/twitter-2fa-and-webauthn/</link><guid isPermaLink="false">6676e0a3bcdd230156168c42</guid><category><![CDATA[twitter]]></category><category><![CDATA[webauthn]]></category><category><![CDATA[2fa]]></category><dc:creator><![CDATA[danjenkins]]></dc:creator><pubDate>Mon, 25 Nov 2019 13:51:32 GMT</pubDate><media:content url="https://dan-jenkins.co.uk/content/images/2019/11/D89adpoX4AY-g_n.png" medium="image"/><content:encoded><![CDATA[<img src="https://dan-jenkins.co.uk/content/images/2019/11/D89adpoX4AY-g_n.png" alt="Twitter, 2FA and WebAuthn"><p>TLDR; Go enable 2FA with either a security key or Authentication App on twitter today!</p><p>Twitter&apos;s had 2FA for a while - both via SMS and via an Authentication App such as Google&apos;s Authenticator or Authy. The difference today is that you used to need to add SMS 2FA before you could enable authentication app 2FA, and <a href="https://www.howtogeek.com/310418/why-you-shouldnt-use-sms-for-two-factor-authentication/?ref=dan-jenkins.co.uk">SMS isn&apos;t the best kind of 2FA in the world</a> - meaning someone who was able to get past your SMS 2FA could still take control of your account.</p><p>For me, 2FA of any kind was better than nothing at all - your attacker would have to do quite a bit to get past your SMS auth, and so having the extra layer of security was important whether it was SMS auth or not. From last week you can now use a new form of 2FA that uses a web standard underneath it, <a href="https://webauthn.guide/?ref=dan-jenkins.co.uk">WebAuthn</a>. 
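</p><p>Under the hood, a site kicks off WebAuthn registration with <em>navigator.credentials.create</em>; here&apos;s a minimal sketch of the options involved (every value is illustrative - the challenge and user id must really come from the server):</p>

```javascript
// Sketch of the options a site hands to navigator.credentials.create()
// to register a security key - every value here is illustrative
function buildRegistrationOptions(challenge, userId, userName) {
  return {
    publicKey: {
      challenge,                    // random bytes issued by the server
      rp: { name: 'Example Site' }, // the "relying party" - the website itself
      user: { id: userId, name: userName, displayName: userName },
      pubKeyCredParams: [{ type: 'public-key', alg: -7 }], // -7 = ES256
      authenticatorSelection: { userVerification: 'preferred' },
      timeout: 60000,
    },
  };
}

// In the browser you would then call:
//   const cred = await navigator.credentials.create(
//     buildRegistrationOptions(serverChallenge, serverUserId, 'dan'));
// and POST cred.response back to the server for verification.
```

<p>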
That means you can now use your hardware security key instead of an Authentication app too.</p><figure class="kg-card kg-image-card"><img src="https://dan-jenkins.co.uk/content/images/2019/11/Screen-Shot-2019-11-25-at-11.44.22.png" class="kg-image" alt="Twitter, 2FA and WebAuthn" loading="lazy"></figure><p>You can choose which forms you want turned on: all three, just two like I have, or a single hardware key if you don&apos;t trust that your phone won&apos;t get lost (if you&apos;re using Google Authenticator) or that your Authy account won&apos;t get compromised.</p><p>While I firmly believe that a hardware security key is indeed best, there are still scenarios where I can&apos;t use mine, and I believe that my Authy account is well enough protected thanks to password managers (which are in turn protected by my security key). Hardware security keys also cost money. One thing I don&apos;t understand is that if Twitter are utilising WebAuthn, why are they only allowing me to use my hardware security key and not the fingerprint reader on my phone or my MacBook Pro, as allowed for in WebAuthn? Maybe because I&apos;d need to add multiple security keys, one per device, and they haven&apos;t allowed for that yet? It would be a brilliant way for everyone to suddenly get the strongest form of 2FA - yourself: your face, your fingerprint and so on.</p><p>Well done, Twitter, for joining the likes of GitHub and Dropbox in implementing <a href="https://blog.twitter.com/engineering/en_us/topics/infrastructure/2019/webauthn.html?ref=dan-jenkins.co.uk">WebAuthn</a>.</p><p>You can buy a hardware key from <a href="https://www.yubico.com/product/yubikey-5c?ref=dan-jenkins.co.uk">Yubico</a> for USB-A and USB-C as well as NFC and Lightning connections. Or you can buy from Google&apos;s <a href="https://cloud.google.com/titan-security-key/?ref=dan-jenkins.co.uk">store</a>, which has a discount on Titan security keys at the time of publication. 
However, be aware that there are differences between the devices and what they support; if you want something that will work with everything, get the Yubico one - it&apos;s $50 but it&apos;ll save you a tonne of hurt in the future.</p>]]></content:encoded></item><item><title><![CDATA[My thoughts on Google Cloud Contact Center AI]]></title><description><![CDATA[<p>TLDR; it&apos;s amazing but has a key issue for users not using commercial platforms.</p><p>I&apos;ve been lucky enough to have done some consulting work recently with one of the largest VoIP providers on the planet, where I got to play with the different aspects of CCAI</p>]]></description><link>https://dan-jenkins.co.uk/google-cloud-contact-center-ai/</link><guid isPermaLink="false">6676e0a3bcdd230156168c40</guid><category><![CDATA[google cloud]]></category><category><![CDATA[ccai]]></category><category><![CDATA[contact center ai]]></category><dc:creator><![CDATA[danjenkins]]></dc:creator><pubDate>Mon, 25 Nov 2019 10:52:25 GMT</pubDate><media:content url="https://dan-jenkins.co.uk/content/images/2019/11/1538298822.png" medium="image"/><content:encoded><![CDATA[<img src="https://dan-jenkins.co.uk/content/images/2019/11/1538298822.png" alt="My thoughts on Google Cloud Contact Center AI"><p>TLDR; it&apos;s amazing but has a key issue for users not using commercial platforms.</p><p>I&apos;ve been lucky enough to have done some consulting work recently with one of the largest VoIP providers on the planet, where I got to play with the different aspects of CCAI, and I can truly say it&apos;s going to be a phenomenal tool in your toolbox. So what is it? 
Well, let me tell you what it isn&apos;t - it isn&apos;t a contact center in the cloud run by Google, because that&apos;s the first thing everyone asks. It&apos;s the intelligence you want in your contact center, on every single phone call: that pair of ears listening in and ultimately being able to report on what your customers want and feel.</p><p>From my point of view it&apos;s made up of two key parts - a Virtual Agent and an Assisted Agent. The Virtual Agent is, as you might have guessed, a virtual agent powered by a knowledge-base inside of Dialogflow. If you don&apos;t know much about Dialogflow: simply put, it&apos;s Google&apos;s conversation API; it can be used for voice as well as text and powers Actions on Google Assistant amongst other things.</p><p>Now, a &quot;Virtual Agent&quot; has essentially been available with Dialogflow for a while - that&apos;s what Dialogflow was: you sent it audio or text, it came back to you with some audio/text, you sent that on to your user, rinse and repeat, and you had a Virtual Agent. Where the Virtual Agent in CCAI is different is the ability to group the whole interaction under one &quot;Conversation&quot; and draw context from within that conversation. However, here&apos;s the really powerful part - you can now decide within your &quot;Intents&quot; in Dialogflow when to pass off to a human agent, essentially bringing that human agent into an already existing conversation; you do this by having an intent that understands &quot;I want to talk to a human&quot; and whose action passes you onto a human agent. 
Historically this is where Dialogflow&apos;s interaction with the end user might have stopped, but now it remains in the conversation as an &quot;Assisted Agent&quot;, continuing to listen in on the conversation between the agent and the customer.</p><p>This new Assisted Agent just went GA and, working in the VoIP industry, it&apos;s the thing I&apos;m most excited about coming from Google Cloud in the past few weeks. Having an artificial intelligence listening in on a call, transcribing it, keeping a record of its component parts, suggesting prompts to the agent in order to help the customer, and then being able to query all of that retrospectively to improve things - that&apos;s just magical for you, your end users and your agents; the plus is that it&apos;s fairly easy to implement too, if you tick the right boxes that is.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://dan-jenkins.co.uk/content/images/2019/11/Screen-Shot-2019-11-22-at-15.41.31.png" class="kg-image" alt="My thoughts on Google Cloud Contact Center AI" loading="lazy"><figcaption>CCAI explanation from Google Cloud</figcaption></figure><h2 id="so-what-s-the-key-issue">So what&apos;s the key issue?</h2><p>The CCAI service as a whole is limited to Google&apos;s &quot;<a href="https://cloud.google.com/products/machine-learning/partners/?tab=tab2&amp;ref=dan-jenkins.co.uk">Contact Center Partners</a>&quot;, which include Genesys, Avaya, Vonage, Cisco, Salesforce and Twilio amongst others. This means the complexity of handling media is left to the Contact Center Partner, and you just deal with the logistics around the knowledge-base that informs decision making - hence it&apos;s fairly easy to get started. 
However, this is where I&apos;d love to see change. I know how many contact centers around the world are run in-house on open source software such as Asterisk and FreeSWITCH, with telephone &quot;lines&quot; provided by a wholesale provider who doesn&apos;t care what you&apos;re doing with that call. As a proponent of being able to choose the best software/hardware for a solution without needlessly paying a licence fee for a commercial Contact Center solution, or higher per-minute fees from, say, Twilio or Vonage, I&apos;d love for those businesses that decide they can do better without a Contact Center Partner to be able to deal with Google Cloud directly on this.</p><hr><p>From a purely operational perspective within your contact center, Google Cloud Contact Center AI&apos;s console will give your team leads, QAs and business insight team members visibility into every call; with text analysis of every call you&apos;ll be able to automatically score each call from a QA perspective to make sure all those &quot;key points&quot; were indeed raised with the customer. Think of the hours you can save and repurpose elsewhere. </p><p>Right now, because you need to operate with one of Google Cloud&apos;s Contact Center Partners, it&apos;s up to each provider to give you your data; how they do that is up to them, and so there&apos;s no code to share here unfortunately. Maybe sometime in the future :)</p><p>Now, I haven&apos;t gone into huge detail about the other aspects of CCAI; if you&apos;re interested in how to get the most from CCAI have a read of the information on <a href="https://cloud.google.com/solutions/contact-center/?ref=dan-jenkins.co.uk">Google&apos;s Contact Center AI website</a> and if that doesn&apos;t answer your questions then feel free to email me at <a href="mailto:dan@nimblea.pe">dan@nimblea.pe</a> and I&apos;ll try to guide you down the right path. 
Nimble Ape offers consulting around RTC solutions and has first-hand experience building voice bots with Dialogflow.</p>]]></content:encoded></item></channel></rss>