
About RSS


The story of RSS is simple and yet combative. In fact RSS’s success may hinge on one man’s idealistic dedication to his principles. Tom takes you through the history of RSS.

Featuring Tom Merritt.

MP3

Please SUBSCRIBE HERE.

A special thanks to all our supporters–without you, none of this would be possible.

Thanks to Kevin MacLeod of Incompetech.com for the theme music.

Thanks to Garrett Weinzierl for the logo!

Thanks to our mods, Kylde, Jack_Shid, KAPT_Kipper, and scottierowland on the subreddit

Send us email to feedback@dailytechnewsshow.com

Episode transcript:

You probably use an RSS feed. In fact if you got this episode as a podcast you definitely used an RSS feed. Most people these days don’t even know they’re there. The story of RSS is simple and yet combative. In fact RSS’s success may hinge on one man’s idealistic dedication to his principles. If you’ve ever thought “why are people making this so complicated?” or wondered what it would be like to shut everyone up with an action that, right or wrong, would stand the test of time, get ready to Know a Little More about RSS.

People say RSS stands for Really Simple Syndication though it really doesn’t. That’s one of the charms of the story of RSS. Throughout its formative years nobody could agree on much and the name is still a matter of debate to this day.
If you’ve heard of RSS at all, it was most likely in connection with Podcasts. Podcasts are delivered through RSS feeds to the apps and platforms where you can listen to them. Behind every Apple Podcast, Google Podcast, Audible Podcast and even most Spotify podcasts, there’s a simple RSS feed. You may also use RSS as a feed for headlines. If you use Feedly, NewsBlur or Inoreader or something like that you’re using RSS.
But where did RSS come from? Oh my friends. Be prepared for a tale of idealism, abandonment, betrayal and perseverance. It is the tale of RSS.
In the earliest days if you wanted to know if a website had been updated you had to visit it. As websites became more common this became a chore. So people experimented with ways to let you know when a website had been updated, without you having to go there. One of the earliest attempts at this was the Meta Content Framework or MCF, developed in 1995 in Apple’s Advanced Technology Group.
Ramanathan V. Guha was part of that group and a few years later, he moved over to browser-maker Netscape, where he and Dan Libby kept working on these sorts of ideas. Guha particularly liked developing the Resource Description Framework, or RDF, similar to the old MCF he worked on at Apple. It was a complex way to show all kinds of things about web pages without having to visit them.
But Netscape’s team of Guha, Libby and friends was not alone. And early on they weren’t the most likely to succeed. The Information and Content Exchange standard, or ICE, was proposed in January 1998 by Firefly Networks, an early web community company, and Vignette, a web publishing tool maker. They got some big names to back ICE too. Microsoft, Adobe, Sun, CNET, National Semiconductor, Tribune Media Services, Ziff Davis and Reuters were among the ICE authoring group. But it wasn’t open source. In those days respectable tech companies like those I just named still cast a skeptical eye on open source code. How were you supposed to make money on it? Who would keep working on it if they weren’t paid? So the members of the ICE authoring group paid people to develop it. And in the end that meant it developed slower than competing standards.
Interestingly, ICE’s failure caused Microsoft to get a little more open, a little earlier than you might expect. In 1997 Microsoft and PointCast created the Channel Definition Format, or CDF. They released it on March 8, 1997, and in order not to fall into the death by slow development that ICE seemed to suffer, they submitted it as a standard to the W3C the next day.
It was adopted quickly and in fact its success planted the seed of its successor. Dave Winer had founded a software company in 1988 called UserLand. UserLand added support for CDF on April 14, 1997, one month after its release. Winer also began publishing his weblog, Scripting News, in CDF. But CDF, like ICE, was more complicated than a smaller site needed. So on December 27, 1997, Winer began to publish Scripting News in his own scriptingNews feed format as well. He just simplified CDF for his own needs and made that available for anyone who wanted to use it to subscribe.
Meanwhile Libby had been working away at his own version of a feed platform, and Netscape was about to make a big launch that would cause his project to surpass them all. On July 28, 1998, Netscape launched My Netscape Portal, one of the earliest web portals: a place that aggregated links from sites around the web. You could add sites you wanted to follow, like CNET or ZDNet, and then see their latest posts all in one place.
Netscape kept the links updated with a set of tools developed by Libby. He had taken a part of an RDF parsing system that his friend Guha had developed for the Netscape 5 browser, and turned it into a feed parsing system for My Netscape. He called it Open-SPF at the time, for Site Preview Format.
Open-SPF let anyone format content that could then be added to My Netscape. It was rich like CDF, open like CDF but had one advantage over CDF. It worked on My Netscape, which suddenly everyone wanted to be on.
Netscape provided it for free because that meant the company didn’t have to spend time reaching deals for content. You want your content on My Netscape, use Open-SPF, it can be there. That meant there was more content available for My Netscape than was usual on curated pages. The content was free for both the users and Netscape. More content meant more users and more users meant Netscape could serve more ads. And content providers were willing to create the Open-SPF feeds, because they weren’t burdensome to create and the sites got more visitors who saw their content on My Netscape and clicked on links to come to their sites.
Sound familiar? This arrangement is the one Google still tries to rely on for Google News. Except the news publishers have changed their tune. Back then they were all about bringing visitors to their websites and happy that Netscape sent folks their way for free. But as the years have passed and revenue has shrunk, now they’re more about getting Google to pay them for linking to their news.
Anyway back to the rise of Netscape.
1999 is not only the end of the millennium. It’s not only when everyone actually got to party the way Prince had been asking them to pretend to party. 1999 was a huge year for RSS. It was about to reach its modern form and become something users of RSS today would recognise. By name.
On Feb. 1, 1999 Open-SPF was released as an Engineering Vision Statement for folks to comment on and help improve.
Dave Winer commented that he would love to add Scripting News to My Netscape but he didn’t have time to learn Netscape’s Open-SPF. However because he had his own self-made feed format using XML he’d “be happy to support Netscape and others in writing syndicators of that content flow. No royalty necessary. It would be easy to have a search engine feed off this flow of links and comments. There are starting to be a bunch of weblogs, wouldn’t it be interesting if we could agree on an XML format between us?”
However by Feb. 22, Scripting News was publishing in Open-SPF and available at My Netscape. Feeling like it was a success, Libby changed the name of Open-SPF to refer to the fact that it used RDF, calling it the RDF-SPF format, and released specs for RDF-SPF 0.9 on March 1. Shortly after release he changed the unwieldy name to RDF Site Summary, or RSS for short. Thus begins the first in a parade of meanings for RSS.
And the new name took off. Carmen’s Headline Viewer came out on April 25th as the first RSS desktop aggregator and Winer’s my.UserLand.com followed on June 10th as a web-based aggregator.
Folks liked the idea obviously, but a lot of RSS enthusiasts thought the RDF was too complex, Dave Winer among them. Libby hadn’t ignored Winer’s earlier offer either. In fact, Libby thought they weren’t really using RDF for any useful purpose. So he simplified the format, adding some elements from Winer’s scriptingNews and removing RDF so it would validate as XML. This was released on July 10, 1999 as RSS 0.91.
Some folks write that the name changed to Rich Site Summary at that point but Winer wrote at the time “There is no consensus on what RSS stands for, so it’s not an acronym, it’s a name. Later versions of this spec may say it’s an acronym, and hopefully this won’t break too many applications.”
Anyway by 1999, like Toy Story, RSS is on a roll. Libby is bringing in feedback from the community and creating a workable usable standard that is reaching heights of popularity beyond just the confines of My Netscape.
Like some kind of VH1 Behind the Music story, just as it reached that height, everything fell apart.
Netscape would never release a new version of RSS again.
In the absence of Netscape’s influence, two competing camps arose.
Rael Dornfest wanted to add new features, possibly as modules. That would mean adding more complex XML and possibly bringing back RDF.
Dave Winer preached simplicity. You could learn HTML at the time by just viewing the source code of a web page. Winer wanted the same for RSS.
On August 14, 2000, the RSS 1.0 mailing list became the battleground for the war of words between the two camps.
Dornfest’s group started the RSS-DEV Working Group. It included RDF expert Guha as well as future Reddit co-founder Aaron Swartz. They added back support for RDF as well as including XML Namespaces. On December 6, 2000 they released RSS 1.0 and renamed RSS back to RDF Site Summary.
Not to be left behind, two weeks later On December 25, 2000, Winer’s camp released RSS 0.92.
Folks, grab your steak knives. We have a fork.
In earlier days, Libby, or someone at Netscape, would have stepped in. But AOL had bought Netscape in 1998 and had been de-emphasizing My Netscape. They wanted people on AOL.com. And if they didn’t care about Netscape, they cared even less about RSS. In fact they actively did things that could have ended RSS. In April 2001, AOL closed My Netscape and disbanded the RSS team, going so far as to pull the RSS 0.91 document offline. That document was used by every RSS parser to validate the feeds. Suddenly all RSS feeds stopped validating. Apparently this had little effect on visitors to AOL.com or people dialing in to their internet connection, so AOL just let them stay broken. With the RSS team gone and AOL doing nothing, RSS feeds were looking dead in the water.
But the RSS 0.91 document was just a document after all. And there were copies. Anybody theoretically could host it as long as everyone else changed their feeds to validate to the new address. Dave Winer stepped up.
Winer’s UserLand stepped in and published a copy of the document on Scripting.com so that feed readers could validate. That right there won Winer a lot of good will.
An uneasy truce followed. Whether you were using Netscape’s old RSS 0.91, Winer’s new RSS 0.92 or the RDF Development Group’s RSS 1.0 they would all validate.
By the summer of 2002, things are going OK and tempers have cooled. Nelly has a hit song advising folks what to do if things get hot in here. Maybe we can solve this? Let’s try to merge all three versions into one new version we can all agree on and call it RSS 2.0, right?
Except they couldn’t agree. Winer still wanted simplicity. RDF folks still wanted RDF and the fun features it would bring. They would agree to a simplified version of RDF but they still wanted it. To make matters more confusing, Winer was discussing what should happen by blog, with everyone pointing to their own blogs. The RDF folks were talking about it on the rss-dev mailing list.
Communication, oddly in a discussion about a communication platform, was the problem. Since neither side was seeing each other’s arguments they never came to an agreement. So Winer’s group decided not to wait. On September 16, 2002, UserLand released their own spec and just went and called it RSS 2.0. And Winer declared RSS 2.0 frozen. No more changes.
Discussions continued on the RSS-dev list but Winer’s camp got another victory when, in November 2002, the New York Times adopted RSS 2.0. That caused a lot of other publications to follow suit, further consolidating the position.
The next year, in another move fending off the debate, on July 15, 2003, Winer and UserLand assigned the RSS 2.0 copyright to Harvard’s Berkman Center for Internet &amp; Society. A three-person RSS Advisory Board was founded to maintain the spec in cooperation with the Berkman Center, which continued the policy of considering RSS frozen. Mic. Dropped.
There was still resistance. IBM developer Sam Ruby set up a wiki for some of the old RDF folks, and others, to discuss a new syndication format to address shortcomings in RSS and possibly replace Blogger and LiveJournal’s protocols. The Atom syndication format was born of this process and was proposed as an official internet protocol standard in December 2005. Atom has a few more capabilities and is more standards compliant, being an official IETF Internet standard, which RSS is not. But in practice they’re pretty similar. Atom’s last update was October 2007 but it is still widely supported alongside RSS.
And RSS 2.0 kept going. In 2004 its ability to do enclosures, basically point to a file that could be delivered along with text, led to the rise of podcasts: basically RSS feeds that pointed to MP3 files.
In 2005, Safari, Internet Explorer, and Firefox all began incorporating RSS into their browsers. Mozilla’s Stephen Horlander had created the Web Feed icon, the little orange block with a symbol like the WiFi symbol at an angle. It was used in Firefox’s implementation of RSS support, and eventually Microsoft and Opera used it too. It was also used for Atom feeds. Stephen Horlander did what most could not: get people interested in providing automated Web feeds to agree on something.
And in 2006, with Dave Winer’s participation, RSS Advisory Board chairman Rogers Cadenhead relaunched the body, adding 8 new members to the group in order to continue development of RSS.
Peace in the form of an orange square was achieved.
OK. So RSS has a colorful history. What the heck does it do?
That part is pretty simple. It’s a standard for writing out a description of stuff so that it’s easy for software to read and display it.
Basically you have the channel (or Feed in Atom) and Items (or entries in Atom).
RSS 2.0 requires the channel to have three elements, the rest are optional. So to have a proper feed you need a title for your channel, a description of what it is and a link to the source of the channel’s items.
Like “Daily Tech News Show,” “A show about tech news,” and a link to dailytechnewsshow.com.
Optional elements of RSS are things like an image, publication date, copyright notice, and even more instructions like how long to go between checking for new content and days and times to skip checking.
The items are the stuff in the feed. There are no required elements of an item, except that it can’t be empty. It has to have at least one thing in it. So an item could just have a title or just have a link. However most of the time an item has a title, a link and a description. The description can be a summary or the whole post. Other elements of the item include author, category, comments, publication date and of course enclosure.
So for our Daily Tech News Show example, the title might be Episode 5634 – AI Wins, the description might be “Tom and Sarah talk about how AI just won and took over everything,” and the link would point to the post for that episode.
The enclosure element lets the item point to a file to be loaded. The most common use for the enclosure tag is to include an audio or video file to be delivered as a podcast.
For Daily Tech News Show that would be a link to the MP3 file.
In the end an RSS reader or a podcast player looks at an RSS feed the way your browser looks at a web page. It sees all the titles, links, descriptions and possible enclosures, and then loads them up and displays them for you.
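To make that concrete, here’s a minimal sketch of an RSS 2.0 feed and a toy reader, using Python’s standard library. The episode title, URLs and MP3 address are invented for illustration, not real Daily Tech News Show links.

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 feed, sketched from the elements described above.
# The episode title, links and MP3 URL are hypothetical examples.
FEED = """<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Daily Tech News Show</title>
    <description>A show about tech news.</description>
    <link>https://dailytechnewsshow.com</link>
    <item>
      <title>Episode 5634 - AI Wins</title>
      <link>https://dailytechnewsshow.com/episode-5634</link>
      <description>Tom and Sarah talk about how AI just won.</description>
      <enclosure url="https://example.com/episode-5634.mp3"
                 length="12345678" type="audio/mpeg"/>
    </item>
  </channel>
</rss>"""

# A toy reader walks the tree much like a browser walks HTML.
channel = ET.fromstring(FEED).find("channel")

# The three required channel elements must be present:
for tag in ("title", "description", "link"):
    assert channel.find(tag) is not None

# Each item may carry an enclosure pointing at the audio file:
for item in channel.findall("item"):
    enclosure = item.find("enclosure")
    print(item.findtext("title"), "->", enclosure.get("url"))
```

A podcast app does essentially this, then hands the enclosure URL to its downloader.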
After a rather stormy opening decade, RSS has settled down into a reliable and, with apologies to team RDF, simple way of syndicating info. Really Simple Syndication indeed.
Like podcasting, which it provides the underpinnings for, RSS has been declared dead several times. But it just keeps on enduring. I hope you have a little appreciation for that tiny file that delivers you headlines and shows now. In other words, I hope you know a little more about RSS.

About Bluetooth LE Audio


Tom breaks down the Bluetooth LE Audio profile, why it’s needed, and when and where you can expect to see it in your devices.

Featuring Tom Merritt.

Sites mentioned during this episode:
Plugshare
A Better Route Planner
SMR Podcast
BBQ and Tech

MP3


Episode transcript:

I already have Bluetooth and I have to say I’m not that impressed.

But now they say there’s Bluetooth LE Audio for my music?

Should I trust this?

Confused?

Don’t be. Let’s help you Know A Little More about Bluetooth LE Audio.

Bluetooth LE Audio is an implementation of Bluetooth LE with a focus on audio quality. So what is Bluetooth LE, you might ask?
Bluetooth LE stands for Bluetooth Low Energy. It’s technically separate from the regular Bluetooth spec but it’s also administered by the Bluetooth Special Interest Group (you can listen to our episode on Bluetooth 5 for more on that). Bluetooth LE uses the same frequency as Classic Bluetooth, 2.4 gigahertz, and it can share a radio antenna, so the two specs are often implemented together. In other words your phone, Bluetooth speaker or headphones might have one or both. But your earbuds are the most likely to have Bluetooth LE in order to save on battery life.
Let’s take a quick dip into where it came from. Nokia researchers adapted Bluetooth in 2001 for low power use and in 2004 published something they called Bluetooth Low End Extension. Continued development with Logitech and other companies including STMicroelectronics led to a public release under the name Wibree in October 2006. The Bluetooth SIG agreed in June 2007 to include Wibree in a future Bluetooth spec, but sadly did not agree to keep the name. It was integrated into Bluetooth 4.0 as Bluetooth Low Energy and marketed as Bluetooth Smart. The iPhone 4S was the first smartphone to support it in October 2011. The Bluetooth Smart name was phased out in 2016. It’s now just called Bluetooth LE, currently grouped under Bluetooth 5. So yes it’s *technically* not the same but it essentially works like a mode of classic Bluetooth. (Pause for shouting of people who say it’s not like that at all, really).
Bluetooth SIG defines profiles for both Bluetooth Classic and Bluetooth LE, basically a definition of how it works for a particular application. One of the LE profiles is the Mesh profile which lets LE devices forward data to other LE devices to create a mesh network. There are a lot of profiles including battery, proximity sensing, internet connectivity, and… tada! Audio.
Since for many folks, Bluetooth means headphones and speakers, the official publication of the Bluetooth LE Audio profile got a lot of attention.
Bluetooth LE Audio’s protocol defines features that expand what low-power devices can do, specifically for audio.
Some of what Bluetooth LE Audio can do already exists. Qualcomm’s aptX Adaptive or Sony’s LDAC codecs offer high quality audio compression and low latency. You just need to pay Qualcomm or Sony to use them. Or you could engineer your own proprietary solution. Which costs you the time to research and develop it.
But you don’t need any of that anymore.
Bluetooth LE Audio will support up to 48 kHz, 32-bit audio at bitrates from 16 to 425 Kbps, with 20-30 millisecond latency versus Bluetooth Classic’s 100-200 millisecond latency, all while going easy on your battery. And it costs a manufacturing company exactly nothing to implement. The magic of industry standards.
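To put those bitrate numbers in perspective, here’s some back-of-the-envelope arithmetic (mine, not the SIG’s) comparing raw stereo PCM at the profile’s maximum sample format against LC3’s top bitrate:

```python
# Raw stereo PCM at Bluetooth LE Audio's maximum sample format:
# 48,000 samples/s * 32 bits per sample * 2 channels.
raw_bps = 48_000 * 32 * 2      # 3,072,000 bits/s of uncompressed audio

# The top coded bitrate quoted for the profile (425 Kbps).
lc3_max_bps = 425_000

compression = raw_bps / lc3_max_bps
print(f"raw PCM: {raw_bps / 1000:.0f} Kbps, "
      f"LC3 max: {lc3_max_bps / 1000:.0f} Kbps, "
      f"~{compression:.1f}x compression")
```

Even at its highest quality setting, LC3 is squeezing the stream by roughly a factor of seven, which is where the battery savings come from.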
Instead of LDAC or aptX, Bluetooth LE Audio uses the LC3 codec. It can deliver higher quality audio at the same bitrate as Bluetooth Classic’s SBC codec and SIG claims it can do better audio than SBC at half the bitrate. That means higher quality audio will use less power.
There’s also a feature called Auracast which lets unlimited audio devices (called “sinks” in the parlance, but we’re talking about speakers, headphones, what have you) connect to a single audio source. For instance everybody in the gym could connect their wireless headphones to a single TV. Everybody in a theater could wear earbuds to get improved movie audio. Users can select Auracasts like they would a WiFi network. Depending on how the OS handles them they’ll probably show up as a little list of “Auracast Broadcasts” and you would select from the list the one you want to hear. Auracast also supports connections by QR code and NFC, so that may be an option sometimes too. And yes, Auracasts can be password protected if you just want to share with friends, and those can show in the list with a lock.
Here’s another example SIG gives: airport gate announcements. Let’s say you’re at gate C17. There could be an Auracast just for gate C17 and then a password-protected one for the gate agent. That way the gate agent hears the airport announcements meant for them and you hear just the announcements for your gate and don’t get confused by that “board now” announcement from C18 next door.
Now you may be wondering how you’ll see that list of Auracasts on your small earbuds. You’ll need an “assistant,” most often a smartphone, though I suppose it could be a laptop or tablet or some such thing. On the “assistant” device you select the broadcast. The assistant then passes on the connection information to the headphones or speaker, which then connect to the broadcast directly. You won’t be dragging down the battery of the assistant device after that.
A few other notable features.
Bluetooth LE Audio also lets each earbud maintain its own connection with the source device. Right now with Bluetooth, only one earbud connects to the source device and then somehow passes along the audio to the other earbud. This is a tricky thing since the head blocks most wireless connections so you have to find a way around it. That’s why the first wireless headphones always wired the two earphones together. Apple lets each earbud in its airpods make a direct connection to an iPhone but that method is proprietary. With Bluetooth LE Audio, more earbud makers can do it as part of the spec. Word is Apple will adopt Bluetooth LE Audio as well, whether they use it for this feature or not.
This should also make it easier to switch between audio source devices.
Bluetooth LE Audio is also better at managing packet loss when you’re at the edge of the range. Bluetooth LE, without the Audio profile, tries to make sure every packet arrives. And if it can’t, it terminates the connection and then reconnects and starts over. This is a good thing when you want to get every bit in a file. For audio streams though, it means your audio cuts out more when you’re at the edge of the range. Bluetooth LE Audio, since it’s specific to audio, takes a different approach to packets. It limits the time a packet has to be retransmitted so that once it’s too old to matter (you know, it’s the “oooh-we-oo” from the last verse) it gets tossed aside and doesn’t cause the stream to be interrupted. The new LC3 audio codec can compensate for this packet loss so you don’t hear skipping. And it should work the way Qualcomm’s aptX Adaptive or Sony’s LDAC codecs have worked over Bluetooth up until now, by reducing the quality a little until the connection gets stronger. So basically instead of skips and dropouts at the edge of the range you may just get slightly tinnier sound.
And praises be, Bluetooth LE Audio supports hearing aids and implants. A huge benefit for devices that really do need battery efficiency. Bluetooth SIG expects most new phones and TVs to be hearing loss accessible within the next few years.
So how can you get it?
Some devices may be able to support Bluetooth LE Audio with a software upgrade, so check with your manufacturer to see. In fact Android 13’s beta supports it already. But will your existing devices get that upgrade? Maybe, maybe not. Your best bet is to check new devices to see if they support it.
You should be able to find more and more products supporting Bluetooth LE Audio over time. In other words, I hope you know a little more about Bluetooth LE Audio.

About QR codes


Tom explores the history, usage, and possible dangers of QR Codes.

Featuring Tom Merritt.

MP3


Transcript:

I went to a restaurant and they said their menu was a little box full of boxes.
How am I supposed to read that?
Someone said point my phone at it?
Confused?
Don’t be, let’s help you Know A Little More about QR Codes.

The “QR” in QR code stands for Quick Response code. It was invented by Masahiro Hara of the Denso Wave subsidiary of Japan’s Denso automotive parts company in 1994. He was inspired by the black and white patterns created when playing the game Go. The original application of the QR code was to identify parts in auto manufacturing at high speed.
The QR code is a type of 2D or matrix barcode, as opposed to the widespread UPC bar code you see a lot of, which is considered a 1D bar code. A 1D bar code is read in one dimension. So with UPC a laser horizontally scans a series of varying widths of black and white bars. Whereas a 2D barcode is read vertically and horizontally and uses rectangles, dots, hexagons and other patterns.
The big advantage of a 2D bar code is it can hold more information and deliver it quicker than a 1D bar code.
A QR code uses black squares called data modules arranged in a square grid on a white background. The background should extend outside the square in what’s called a “quiet zone” to make it easy to detect what’s actually part of the QR code’s matrix. You can encode four standard types of input data or “encoding modes:” numeric, alphanumeric, byte/binary and kanji.
The maximum amount of information you can encode depends on which of these inputs you’re using as well as your level of error correction and the dimensions of the grid. Grid dimensions are described by a version number from 1 to 40, with version 1 having 21 by 21 data modules and each version adding 4 until you get to version 40 with 177 by 177 data modules.
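That sizing rule can be sketched in a few lines of Python. The capacity figures are the commonly cited maximums for version 40 at the lowest error correction level:

```python
def modules_per_side(version: int) -> int:
    """Version 1 is 21x21 data modules; each version adds 4 per side."""
    if not 1 <= version <= 40:
        raise ValueError("QR versions run from 1 to 40")
    return 21 + 4 * (version - 1)

print(modules_per_side(1))    # 21
print(modules_per_side(40))   # 177

# Maximum characters at version 40 with the lowest (L) error correction:
MAX_CAPACITY_40L = {"numeric": 7089, "alphanumeric": 4296}
```

So a restaurant-menu code at version 2 through 5 is only 25 to 37 modules per side, a fraction of what the format allows.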
Maximum capacity can be found with the 40-L numeric encoding which encodes just numbers at the maximum dimensions of the grid with the lowest error correction. It can hold 7089 characters. The Alphanumeric version of the same thing holds 4,296 characters. Most QR codes you see in everyday life are around versions 2-5 and usually hold between 20 to 100 characters, enough for a shortened URL.
Because a QR code is two dimensional you need an image sensor to detect it. Since almost every phone now has a camera, the phone has become the most familiar way QR codes are scanned.
A Reed-Solomon error correction process is used to interpret the pattern. Reed-Solomon is also used in CDs, Blu-ray Discs, DSL and RAID 6. In QR codes there are four levels of error correction: L is the lowest, restoring approximately 7% of data; M is the middle at 15%; Q is the next up at 25%; and H is the highest at 30%. This is going to offend statisticians and data professionals, but you can roughly think of it as: if up to 7% of the data is damaged, the L error correction will still let you read the data. In practice most QR codes seem to use M. I guess they assume if more than 15% of that sticker is damaged you might as well get a new menu sticker.
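A toy sketch of that rule of thumb, treating the recovery percentages as hard thresholds (real Reed-Solomon decoding is more subtle than this):

```python
# The four QR error correction levels and the approximate fraction of
# data each can restore, per the figures above.
RECOVERY = {"L": 0.07, "M": 0.15, "Q": 0.25, "H": 0.30}

def still_readable(level: str, damaged_fraction: float) -> bool:
    """Rough rule of thumb: the code stays readable while the damaged
    fraction is within the level's recovery capacity."""
    return damaged_fraction <= RECOVERY[level]

print(still_readable("M", 0.10))  # 10% damage, M restores ~15%
print(still_readable("L", 0.10))  # 10% damage exceeds L's ~7%
```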
But let’s get into how that pattern of blocks gets turned into your restaurant menu or wifi password or name of a conference room or whatever. The whole QR code is made up of just those blocks, called data modules, either black squares or empty white spaces.
You might have noticed there are always three distinctive larger squares in the corners of a QR code. Those are position markers. They are used along with a smaller square or set of squares in the fourth corner to calibrate the size, orientation and angle at which the pattern is being viewed.
Now your QR code reader, likely your phone’s camera, knows where the code is and can adjust for how big it looks in your camera. It can even do these adjustments on the fly as your unsteady hand wavers over the restaurant table.
Next it needs to know some things about what kind of encoding and error corrections and such were used. This way it can interpret the data correctly.
The mode indicator is placed in the bottom right, indicating the input type. Other format information, like error correction quality and character count, is placed near the three squares. These are done as a sequence of 4-bit indicators.
That stuff is always the same and lets the reader know whether to look for numbers, letters, kanji, whatever, and how much will be redundant error correction code.
Now it’s time to read the whole point of this exercise. The data. The thing. The link to the menu. The kind of auto part this is. The WiFi Password!
In the space remaining after the position markers and format data, the encoded data is placed from right to left in a zigzag pattern until it reaches an end indicator. The amount of bits used for your data varies by the type of input. So numbers can get 3 digits into 10 bits, alphanumeric gets 2 characters into 11 bits and so on. You can even switch encoding types if you need to. Just throw in another 4-bit indicator.
You often need to mix input types because alphanumeric mode can only do digits, capital letters, the space and eight punctuation marks. So to do anything beyond that you need to use byte mode, which takes up more bits.
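Those bit costs can be sketched as follows. The trailing-group costs (7 bits for 2 leftover digits, 4 bits for 1; 6 bits for a leftover alphanumeric character) come from the QR spec’s packing rules rather than anything stated above:

```python
def numeric_bits(digits: str) -> int:
    """Bit cost of QR numeric mode: 10 bits per group of 3 digits,
    7 bits for a trailing pair, 4 bits for a trailing single digit."""
    full_groups, remainder = divmod(len(digits), 3)
    return full_groups * 10 + {0: 0, 1: 4, 2: 7}[remainder]

def alphanumeric_bits(text: str) -> int:
    """Bit cost of QR alphanumeric mode: 11 bits per pair of
    characters, 6 bits for a trailing single character."""
    full_pairs, remainder = divmod(len(text), 2)
    return full_pairs * 11 + (6 if remainder else 0)

# 8 digits: two groups of 3 (20 bits) + a trailing pair (7 bits) = 27
print(numeric_bits("01234567"))      # 27
# 5 characters: two pairs (22 bits) + one single (6 bits) = 28
print(alphanumeric_bits("HELLO"))    # 28
```

Note how 8 digits cost fewer bits in numeric mode than 5 characters do in alphanumeric mode, which is why encoders pick the tightest mode that fits the data.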
And that’s it. Once the reader has interpreted all that, it has the data, and then the reader goes from there, whether that’s showing you a URL you can tap or a WiFi password you can enter or the name “brake pad.”
You may wonder who keeps track of how that all works so that every reader works with every QR code.
QR codes have been standardized multiple times over the years. The first time was in October 1997, issued by the Association for Automatic Identification and Mobility, followed by one in January 1999 from JIS, or Japanese Industrial Standards. And then the heavy, the International Organization for Standardization, or ISO, issued its first standard in June 2000 and most recently updated it on February 1st, 2015.
Denso freely licenses QR code tech as long as users follow either the JIS or ISO standards. While Denso holds patents on the technology, it waived its rights for standardized codes and its patents in the US and Japan have already expired.
Denso does still hold the trademark on the name QR code and maintains some proprietary, non-standard implementations. But the ones you mostly see are standards-compliant.
You probably figured this out but QR codes are static. Once they’re printed, they don’t change. Even if you made an animated GIF of a QR code, the reader would just keep trying to show you the latest one. Once you make a QR code it’s meant to stay that way. Which makes them great for permanent information, which is why they were very good at parts identification. This is a shock absorber and we have very little expectation it will suddenly become a brake pad so we can slap a QR code on it so the assembly robots know what it is.
However at some point folks had the bright idea to encode URLs into QR codes. Why not? URLs are just strings of characters after all. Now, the URLs are still static. But any URL can be made to point to a different thing over time by redirecting it. Knowalittlemore.com for instance points to the Acast site where the podcast lives. I could change that to point to the Daily Tech News Show blog posts about the show instead, if I wanted. So URLs sort of bring in the idea of a dynamic QR code, and so some people refer to static vs. dynamic QR codes. Let’s be clear: they’re all static. When someone says a QR code is dynamic, it just means it encodes a URL, and that URL can be redirected to different things over time. This is helpful for, say, a restaurant that changes its menu.
It is also helpful for malicious types who want to commit crimes.
As I think is clear by now, QR codes themselves are not risky, since they only hold static data. A QR code reader working properly prevents unauthorized execution of that data, and there’s not a lot of room to fit a very capable executable in there anyway. So the bigger worry is the URL. The practice of encoding URLs in QR codes is widespread, dare we speculate it is the norm, and that means the same risks that come with clicking any URL anywhere come with QR codes. One weakness could be a third-party QR code reader that lets its permissions down a little. But even the most buttoned-down reader– the one the OS manufacturer builds into the camera app– can still take you to a malicious site, like any link in an email, text message or web page.
As such you should only scan QR codes if you’re certain of the source. QR code stickers out in the world might be legitimate or might have been stuck there by someone malicious, possibly over the legitimate code. This doesn’t mean you should never scan a QR code in public but use secure QR code readers and look carefully at the link you’re being sent to before tapping it.
Some malicious links can look to you like they operate normally while engaging in malicious behavior like accessing your browser history or sending text messages without your knowledge.
It’s also good to double-check the URL after you tap to make sure it took you where you expected to go. Don’t just look at the graphics or the site layout; those can be faked. And resist the urge to log in, pay for something or download an app from a QR code link. Those are all popular scam vectors. There are legitimate times to use QR codes for those things, but you need to be very sure about the legitimacy of the code before you do any of them.
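If you want to see what “look carefully at the link” means in practice, here’s a tiny Python sketch using the standard library’s urllib.parse. The function name and the checks are ours, purely illustrative of the pieces worth eyeballing before you tap:

```python
from urllib.parse import urlparse

def describe_scanned_url(url):
    # Break a scanned link into the parts worth checking before tapping.
    parts = urlparse(url)
    return {
        "scheme": parts.scheme,                 # expect "https"
        "host": parts.hostname,                 # the domain you'd actually visit
        "looks_secure": parts.scheme == "https",
    }

print(describe_scanned_url("https://dailytechnewsshow.com/feed"))
# {'scheme': 'https', 'host': 'dailytechnewsshow.com', 'looks_secure': True}
```

The host is the part scammers most often disguise with lookalike domains, so that’s the field to read slowly.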
And finally, keep in mind that while the actual scanning of a QR code leaks no data, using a QR code to go to a website exposes all the same kinds of data as any visit to a website– your IP address, the kind of browser and device you’re using, and so on. This is no worse than browsing the web, mind you, but it’s worth remembering.
One more thing: there are a few variations on the QR code you may encounter.
The Micro QR code holds a very small amount of info but takes up very little space, so it’s often used on small items. It has only one positioning square, in the upper left corner.
Denso Wave has a proprietary version called the iQR code that can be square or rectangular. It works well on cylindrical objects and holds more information than the standard QR code.
And Frame QR codes take advantage of the error correction process to allow for a canvas area that can be used for logos, graphics etc.
QR codes are just big dumb links out in the world made of squares. Treat them like any big dumb link you’d find anywhere.
In other words, I hope you know a little more about QR codes.

About CBDCs


Tom breaks down Central Bank Digital Currencies and why they are not the same as cryptocurrency.

Featuring Tom Merritt.

MP3

Please SUBSCRIBE HERE.

A special thanks to all our supporters–without you, none of this would be possible.

Thanks to Kevin MacLeod of Incompetech.com for the theme music.

Thanks to Garrett Weinzierl for the logo!

Thanks to our mods, Kylde, Jack_Shid, KAPT_Kipper, and scottierowland on the subreddit

Send us email to feedback@dailytechnewsshow.com

Episode Transcript:

I’ve barely wrapped my head around Bitcoin and now you tell me the government is getting into it?
There’s something called Seabee Geebies or CBDCs??
Is cash going away?
Are you confused?
Don’t be.
Let’s help you Know a Little more about Central Bank Digital Currencies or CBDCs

Central Bank Digital Currencies or CBDCs are digital currencies issued by a government institution– usually a central bank– as an alternative to– not a replacement for– coins or printed money. They’re sometimes called digital base money or digital fiat currency. Fiat currency is the name for the money most governments issue. Side note: fiat means issued by order or decree. The term fiat money arose to distinguish it from money backed by something like gold or silver.
CBDCs are often compared to cryptocurrencies because they’re digital, but that’s a confusing comparison because they have different aims. Cryptocurrencies are generally meant to be independent, and often are decentralized, where no one entity controls the servers the system runs on. Even when a cryptocurrency is centralized it’s generally meant to be independent of governmental institutions. And cryptocurrencies are often seen as an investment. CBDCs are no more or less of an investment than the country’s fiat currency.
CBDCs are also different from a particular kind of cryptocurrency called a stable coin. Stable coins are usually linked to a stable fiat currency so the value doesn’t fluctuate any more than the currency it’s based on. One stable coin linked to the Euro would always be worth one Euro. These are closer to a CBDC but they’re not issued by the central bank.
So I think about government digital currency less as a government form of cryptocurrency and more like a digital driver’s license or a virtual transit pass. It’s something the government creates in a digital form instead of physical form.
And most of the CBDCs in development don’t use a blockchain or any kind of distributed ledger. In many ways they’re more like the money in your bank account than they are like Bitcoin. Except even though you only see your bank account money as a number on your bank’s website or app, somewhere– allegedly– there’s a stack of paper money representing the balances in that bank. With CBDCs there would not be. The digital currency would be the equivalent of the paper currency itself, not just a digital record that it exists.
So if they aren’t cryptocurrencies exactly and they aren’t necessarily using a blockchain and aren’t even backed by paper money what are they and how do you get off calling them money?
Let’s figure out how this works.
Whatever system a central bank uses, a CBDC will rely on the consumer having a digital wallet. This will most often be done in software on a mobile device, but could also be done in hardware like a smart card or some other kind of smart dongle.
The wallet would authenticate the user in some robust way. This could be by PIN, password or biometrics. Most CBDCs contemplate using the FIDO Alliance’s passwordless strong authentication. See our episode on FIDO for more on that.
Wallets would also need to authenticate parties in a transaction, whether sending or receiving currency. This can be done with public and private key exchanges (we have an episode on that too) between two wallets or with a central database.
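As a loose illustration of that handshake: real designs would use asymmetric public/private key pairs, but since Python’s standard library only ships symmetric primitives, this sketch stands in with a shared-secret HMAC challenge-response– all names here are made up– just to show the shape of the exchange:

```python
import hmac
import hashlib
import secrets

# The wallet is provisioned with a secret it never transmits.
wallet_secret = secrets.token_bytes(32)

def sign_challenge(secret, challenge):
    # The wallet proves it holds the secret by signing a fresh challenge.
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def verify(secret, challenge, response):
    # The verifier recomputes the signature and compares in constant time.
    return hmac.compare_digest(sign_challenge(secret, challenge), response)

challenge = secrets.token_bytes(16)  # sent by the other party in the transaction
response = sign_challenge(wallet_secret, challenge)
print(verify(wallet_secret, challenge, response))  # True
```

The fresh random challenge is what stops someone from recording one payment authorization and replaying it later.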
A few central banks are considering using a distributed ledger like a blockchain. It would still be centrally controlled but would include the security of the blockchain and be an easy way to get a system running. But it also introduces some complexity that’s not necessary for this system. You don’t need to avoid centralized control, which is one of the main aims of a blockchain.
So most Central Bank Digital Currencies being developed use a token-based system. Tokens are protected with strong encryption from being duplicated– kind of like Bitcoin– and then recorded in a database under the control of the government, usually the central bank. The bank itself may run the database or it may contract a private entity to do it for them, but the government is in charge, not the private entity.
The database keeps a record of any entity, people, companies, government organizations etc, that hold the digital currency. So you could have an account which tracks the balance in your CBDC wallet from which you could pay others or accept payments or deposits.
There isn’t one settled way to run a CBDC yet. One thing they all have in common though, is the need for strong cryptography to keep each unit of the currency from being copyable. And as far as payments and transactions, there’s a lot of security already built into the current system– like in point of sale units– that can be adapted for CBDCs.
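To make the single-authority idea concrete, here’s a toy Python sketch of a centrally run account database– one issuer, instant settlement, no intermediaries. The class and wallet names are illustrative, not any real central bank’s design:

```python
class CentralLedger:
    """Toy model of the central database of wallet balances."""

    def __init__(self):
        self.balances = {}

    def issue(self, wallet, amount):
        # Only the central authority creates new currency units.
        self.balances[wallet] = self.balances.get(wallet, 0) + amount

    def transfer(self, sender, receiver, amount):
        # Settlement is immediate: both balances update in one step,
        # so the payer knows right away whether the payment succeeded.
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient funds")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

bank = CentralLedger()
bank.issue("alice", 100)
bank.transfer("alice", "bob", 40)
print(bank.balances)  # {'alice': 60, 'bob': 40}
```

Compare that one-step transfer with the chain of processors and clearing houses a card payment passes through today, and the efficiency argument becomes easy to see.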
That sounds like a lot of work. Why do that? Why not just keep the system we have now? It works, right?
Well CBDCs, like the blockchain-based cryptocurrencies that inspired them, would be way more efficient. Right now when you pay someone using a bank or a credit card there are dozens of entities involved in the transaction. The point of sale system talks to a credit authorization system which communicates with a payment processor which talks to a clearing house which talks to a bank. And that’s a major oversimplification. That’s why money transfers can take up to three days.
With a CBDC there’s one entity, the central bank, that does the transfer, in real time from you to the person you’re paying. This reduces risk because you know immediately if the payment was successful. It makes accounting easier because you don’t have a lot of stuff you’re waiting to clear through the banking system. And it eliminates fees since there are no organizations in the middle taking a cut.
And because your account/wallet holds your actual digital currency a run on the bank would not cause your money to be unavailable because the cash isn’t in the vault.
Another benefit is that CBDCs are often promoted as a solution for the unbanked. Banks need to develop and maintain infrastructure to provide access to their financial systems. This involves verifying identities, issuing credit and debit cards, offering ATMs and so on. CBDCs could just run on a phone with a connection to the CBDC database. So instead of having to apply for a bank account, every citizen could get a wallet or account from the central bank through a phone. This could be done with an app, but as systems like M-Pesa in Kenya have shown, it could also operate over text messaging. 89.9% of people own at least one mobile phone– that’s 7.1 billion people. And even for those who don’t have or don’t want to use a phone for CBDCs, cards similar to transit cards could be created to act as digital wallets for the digital currency.
And then there’s a benefit that’s also a downside: tracking. Every transaction is recorded, which helps governments collect taxes and combat crimes like money laundering. But it also means the government knows every transaction, which makes people uneasy, especially if they don’t trust their government.
Another downside with an upside is that CBDCs could take away a kind of revenue from banks, causing them to shift their business models. A downside for the banking industry, but possibly an upside for consumers, who might benefit from increased competition among banks for your deposits and loans. They’d have to offer you new features to convince you, versus now, where you feel like you have to use them because your alternative is sticking your money in a mattress or burying it in a jar out back.
And of course the one main downside to CBDCs: centralized control. Bad actors within a government might be tempted to abuse their control to punish political opponents or activists by removing money or access. More mundane and more likely are the privacy and security issues already faced by ISPs and banks today. The central bank would become a prime target for attackers looking to crack into the database and steal money or information.
Up until now I’ve been describing what are called Retail CBDCs. The money us regular folks use in day to day life. There’s also something called a Wholesale CBDC. These would be used for payments between central banks or between any banks. You think the system is complicated for you buying that Violet Crumble at the Aldi? It’s way more complicated for banks to exchange money across borders. CBDCs could be used to make it easier for banks to do cross-border transfers.
In September, central banks in Australia, Singapore, Malaysia and South Africa started testing a system to use CBDCs to make cross-border transactions cheaper and easier between those countries.
And the Bank for International Settlements which handles this issue for fiat currencies is also exploring using CBDCs for cross-border payments with the central banks of China, Hong Kong, Thailand and the UAE.
So we know how they kind of work. And we know some big fancy international banking is testing them. When can I get a wallet?
I mean seriously. Is any country actually doing this for its citizens?
Yes. And it’s not El Salvador. You may have heard that El Salvador adopted Bitcoin as an official currency. That is not a Central Bank Digital Currency because it’s not issued by a central bank. It’s no different than El Salvador saying Canadian Tire Bucks are now its official currency. It’s the government giving its official blessing to a currency it doesn’t control as a way to pay for things.
But it’s not a Central Bank Digital Currency.
The Bahamas get the credit for the first Central Bank Digital Currency. The Sand Dollar is the official digital version of the Bahamian dollar, issued by the Central Bank of the Bahamas in collaboration with MasterCard and Island Pay. It was officially deployed in October 2020.
With 700 islands, moving actual cash around the Bahamas is costly and time-consuming. You have to put it on boats and stuff. The hope is that disbursements using the sand dollar will reduce the need to move actual paper notes by boat or otherwise.
Five other Caribbean islands have followed suit, including St. Kitts and Nevis, Antigua and Barbuda, Saint Lucia and Grenada.
China is also fairly far along with a central bank digital currency.
China is the biggest country that has an active test of a working CBDC, the digital RMB for domestic use and digital Yuan for international use. It launched its test programs in 2020.
China’s CBDC exists on a phone or digital card and does not need an active internet connection to make transactions, though it does need internet to access accounts. Some of China’s tests of its digital currency set expiration dates to encourage spending. But they don’t usually do that. And China replaces one unit of physical currency for every digital unit it releases, keeping the money supply the same. China’s CBDCs are issued by the People’s Bank of China to a few private banks for disbursement, keeping banks in the loop.
China has conducted multiple tests of CBDCs in many cities like Shanghai, Shenzhen and Beijing, giving citizens free grants of small amounts to spend at a few participating test locations including McDonald’s, Subway and Starbucks. The big test is expected to happen at the 2022 Winter Olympics in Beijing, which will feature a pilot program with a wide national and international footprint for the first time.
And that’s about it. While around 90% of central banks are investigating or developing digital currencies, not many have launched them.
For instance MIT and the Boston Fed are undertaking Project Hamilton to research and test a FedCoin for the US. The Bank of England has created a CBDC task force and the EU launched a two-year investigation into a digital Euro project in July 2021. But none of them are coming anytime soon.
So there you have it. Central Bank Digital Currencies are something central banks around the world would someday like to issue as an alternative to paper notes and coins, that you could hold and spend in digital cards or phone apps for easy efficient spending and saving.
In other words I hope now you know a little more about CBDCs.

Apple Places Wistron on Probation – DTH

Apple places Wistron on probation after an internal audit found the company violated its Supplier Code of Conduct, Microsoft is looking to develop its own ARM chips, and the operators of the supply-chain attack against SolarWinds had access since at least October 2019.

MP3

Please SUBSCRIBE HERE.

You can support Daily Tech Headlines directly here.

A special thanks to all our supporters–without you, none of this would be possible.

Big thanks to Dan Lueders for the theme music.

Big thanks to Mustafa A. from thepolarcat.com for the logo!

Thanks to our mods, Kylde, Jack_Shid, KAPT_Kipper, and scottierowland on the subreddit

Send us email to feedback@dailytechnewsshow.com

Show Notes
To read the show notes in a separate page click here!

About EU-US Privacy Shield


Tom dives into the complex history and potential implications of EU-US Privacy Shield.

Featuring Tom Merritt.

MP3

Please SUBSCRIBE HERE.

A special thanks to all our supporters–without you, none of this would be possible.

Thanks to Kevin MacLeod of Incompetech.com for the theme music.

Thanks to Garrett Weinzierl for the logo!

Thanks to our mods, Kylde, Jack_Shid, KAPT_Kipper, and scottierowland on the subreddit

Send us email to feedback@dailytechnewsshow.com

Episode Script
I heard Europe declared sending data to the US illegal
Does that mean I can’t send email to my friends?
Is Facebook going away in Europe?
Are you confused?
Don’t be.
Let’s help you Know a Little more about the EU-US Privacy Shield

On July 6, 2020 the Court of Justice of the European Union, or CJEU, invalidated an agreement between the EU and US that allowed data to be transferred easily between the two regions.
The agreement known as Privacy Shield was the most recent attempt to harmonize the privacy laws of the two governments.
So why do we even need this and why do I care?
You may care if you use any kind of cloud service in the US from Microsoft Azure to Facebook, because without easy data transfer your costs could go up or services might go away.
Here’s why you need an agreement at all.
With a few limited exceptions, it is illegal for a company to send personal information about EU residents to a place that doesn’t offer equivalent privacy protections to what the EU provides.
This does NOT apply to “necessary” data transfer, like sending an email from the EU to the US or booking travel on a US website from the EU or vice versa. This law applies to data that could be stored anywhere.
About 5,000 companies move personal data between the US and EU for their own internal reasons. Maybe they get a better deal on servers in one location so they store all the data there. Maybe it makes some particular operation more efficient.
Cross-border activities include cloud services, Human Resources, marketing et cetera.
Of course companies could just keep EU data in the EU and US data in the US but that means redundant systems, complex programming to make sure data gets routed to the right region, difficulties when you need to look at all your data in aggregate across the two regions and more. It costs more money to keep the two regions’ data completely separate.
Here’s a really oversimplified example. I run a website called dailytechnewsshow.com and I turn on comments. My web server is in the US. Without an agreement, when somebody makes a comment from Europe, I need to store that comment in the EU. Which means I now have to have a separate server just for EU comments, doubling my costs. Also, when I run analytics on my visitors to see how many people are visiting and what they click on, I have to run it separately on the two servers and then aggregate the data at the end to see results. This means my stock analytics program might need to be reprogrammed.
Yes, I can already hear the objections that this isn’t really how this works, but it gives you a metaphor. Take those problems, multiply them by $5 trillion and corporate cloud services, and you can sort of wrap your head around the problem.
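Still, the split-infrastructure headache from that example can be sketched in a few lines of Python. The region names and in-memory stores are purely illustrative stand-ins for real regional servers:

```python
# Without a transfer agreement, each comment must be stored in the
# commenter's own region, and analytics must query each region
# separately and combine the results afterward.

stores = {"EU": [], "US": []}

def save_comment(region, comment):
    # EU visitors' data stays on the EU server, and vice versa.
    stores[region].append(comment)

def total_comments():
    # Aggregation now means touching every regional store.
    return sum(len(s) for s in stores.values())

save_comment("US", "great episode")
save_comment("EU", "love the show")
print(total_comments())  # 2
```

Every place that `total_comments` style aggregation happens in a real analytics stack is a place that would need reprogramming.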
Now, this is only an issue for companies who operate across both the EU and US. If all your personal data is in the EU, you just keep all your data there, problem solved.
But if you have customers or employees in both regions you need to respect both regions’ laws. Privacy Shield makes doing that simple.
Without an overall agreement, each of the approximately 5,000 companies that handle data across the US and EU has to negotiate its own Standard Contractual Clauses, or SCCs– or use a similar mechanism called Binding Corporate Rules, or BCRs. For simplicity, from now on we’ll just refer to SCCs since the issues with both are similar.
Of course the other option is to stop bringing any personal data from the EU to the US. Which, as we said, costs money, time and resources.
We used to have a solution for this.
From 2000 to 2013 everything seemed like it worked fine under an agreement called Safe Harbor.
Companies sending EU citizens’ data to the US opted into EU-style privacy rules, enforced by the US government.
The other option was the aforementioned SCC that uses preapproved EU contract language to essentially achieve the same thing.
Big companies with enough legal expertise usually implemented SCCs as a backstop but all companies were covered under the Safe Harbor rule even if they didn’t want to mess with SCCs.
Then Edward Snowden came along and leaked documents showing that the US NSA had access to bulk collection of data from people who were not citizens of the US.
Austrian Max Schrems challenged the Safe Harbor agreement, arguing that the NSA access was allowed under Safe Harbor and therefore the Safe Harbor agreement was in conflict with EU law which didn’t allow this kind of surveillance.
The Court of Justice of the European Union agreed with Schrems and ruled that Safe Harbor did not properly protect EU data.
The court identified two main problems. The first was the process of bulk access by US intelligence services.
In the decision striking down Safe Harbor, the European court wrote “…access on a generalised basis to the content of electronic communications must be regarded as compromising the essence of the fundamental right to respect for private life, as guaranteed by Article 7 of the Charter.”
The second problem was the inability of EU citizens to seek redress in the US over this access, which the court determined interfered with the right of EU citizens to an effective remedy.
In the wake of this invalidation, EU lawmakers and the US Department of Commerce worked together to create Privacy Shield, a new framework that addressed the court’s concerns. The US Office of the Director of National Intelligence made written assurances that any access by public authorities for national security purposes would be subject to clear limitations, safeguards and oversight mechanisms. Six situations were defined for when the NSA could use bulk-collected data. This was meant to show the access to bulk data was not general.
As for the right to complain in the US, Privacy Shield also created an independent Ombudsperson who could hear complaints from Europeans about how their data had been handled by the NSA. The US put this person in the State Department, separate from national security services.
Complaints had to be resolved in 45 days and national data protection authorities would work with the US FTC and Commerce Department to resolve all complaints.
And the US enacted a law giving EU citizens access to US courts to enforce privacy rights in relation to personal data transferred to the US for law enforcement purposes.
In practice, each company would have to certify its privacy policies were in line with Privacy Shield each year and the US Department of Commerce would verify this.
It was adopted in 2016 but immediately faced criticism. In particular, people felt that Privacy Shield would not be compatible with the General Data Protection Regulation, or GDPR, which had also passed in 2016 and would go into effect in 2018.
Max Schrems brought another case against Privacy Shield– commonly called Schrems II– and won again.
So Why Did Privacy Shield Fail?
The European Court said “The limitations on the protection of personal data… are not circumscribed in a way that satisfies requirements that are essentially equivalent to those required under EU law.”
In other words, limiting bulk data access to six cases did not give Europeans equivalent privacy protection to what they got in Europe.
But the court didn’t invalidate the SCCs instead saying that companies must decide whether the laws of the countries where they are sending data offered adequate protection under EU law. Most companies took that to mean they could keep using SCCs and continue operations as normal.
And on cue, a Max Schrems appears. Well, this time Noyb, a group founded by Max Schrems, filed complaints against 101 European websites, arguing they should stop sending data to US-based tech providers– even under SCCs– because the US doesn’t provide adequate protection for Europeans against surveillance.
These complaints were lodged with the data protection agency of each European country where the companies running the websites are headquartered.
Facebook’s European operations happen to be headquartered in Ireland.
Guess what happened next?
In September, 2020, Ireland’s Data Protection Commission issued a preliminary order for Facebook to suspend EU user data transfers to the US. Ireland said the SCC was not sufficient. The order has not been finalized and could change. Facebook had a chance to respond and a new draft was sent to the 26 privacy regulators in the EU for joint approval.
Facebook is taking this seriously.
In a filing, Facebook wrote “In the event that [Facebook] were subject to a complete suspension of the transfer of users’ data to the US, as appears to be what the DPC proposes, it is not clear to [Facebook] how, in those circumstances, it could continue to provide the Facebook and Instagram services in the EU.”
You take away our SCC protection, we take away Instagram!
Would they? Well that might be costlier than adapting to keeping data separate but they certainly are worried about it. Facebook can also tie this up in the courts for awhile to buy time.
As to those SCCs, there may be hope for Facebook and others on that front as well. The EU had been working on updating SCCs to account for GDPR. That work paused while the Privacy Shield case was going on– you know, to see what would happen– but the work has now resumed.
And the EU and US are still working on a new agreement. The problem is the two major court decisions seem to leave the US no choice but to change its surveillance laws in respect to Europeans.
But the US says its surveillance practices are proportionate. The US Commerce Department believes amendments to US laws passed since 2016 increase protections and mitigate many of the concerns in the Privacy Shield case.
A white paper issued by the Department of Commerce in October 2020 essentially says the US thinks the court got it wrong and doesn’t plan to compromise.
It asserts that the US collects no more data than Europe, that collection of security data by EU member states is beyond the review of the CJEU, and that the EU should want the US to collect data to bolster security since the US shares intelligence with the EU. This won’t change the Privacy Shield decision, but it may help bolster cases surrounding SCCs.
So here’s where we stand
The Court of Justice of the European Union says the status quo isn’t sufficient.
The US says it is.
The US has given some solid rationale for using SCCs.
European Data protection authorities seem to be going after SCCs.
And thousands of companies, millions of people and billions of dollars in trade hang in the balance.
In other words I hope now you know a little more about the EU-US Privacy shield.

Why Netflix’s Corporate Shakeup Matters To You

While 2020 has been known for many things, one thing it will forever be remembered as, in the world of entertainment, is the year every media conglomerate opted to shake things up — and the latest of the bunch is none other than everyone’s favorite binge streamer, Netflix.

As reported by THR yesterday, the big N, following the loss of Channing Dungey as the SVP of Original Content earlier this month, has decided to change some things within its corporate structure.

Up until now, Netflix siloed its content machine by genre, and then into different budget levels and territories within those genres. One exec would handle high-budget dramas while another handled low-budget sci-fi/fantasy and another took charge of Canadian imports as another managed the Irish mysteries. As you can imagine, this makes for a hell of a strange development process, which Netflix has finally realized. Starting now, the company will instead pursue a more traditional TV content model broken out by drama, comedy, event series, unscripted series and overall deals (the Shondas and Ryan Murphys of the world get to do whatever they want as long as they do it for Netflix).

Sound boring? Understandable. But, this is a pretty big deal for a company that, essentially, hates the idea of doing things the traditional way.

Netflix is a rebel. A disrupter. A– whatever buzzword you want to use for a company that thinks a way of doing things is broken simply because it’s old. But sometimes, a system is unchanged for years and years because it works. Restructuring the company into a more traditional model that lacks these “silos” will give Netflix’s creators a better understanding of just who they need to talk to for their project and help streamline the company’s content development process.

Now, instead of seeking out the head of YA/Family event shows based on New York Times bestsellers, you can just pitch to the head of drama.

More important, however, is what this move could signal for Netflix’s future. Is the company finally coming around to the fact that stacking your slate with a cavalcade of expensive loss-leader programs, while flashy, doesn’t actually produce a profit? Possibly. Could it signal Netflix is now going to be more willing to make actual television and not just ten-hour movies? Again, maybe. Could they be launching a free-tier as the idea of AVOD content continues to gain traction within the mainstream? WHY NOT?!

None of this is meant to imply we know what Netflix is going to do next but we do know one thing. Netflix is changing. Arguably, for the better. A media company cannot survive by pouring tens of millions of dollars into flashy genre shows, backed by movie stars, that get canceled after a single season.

Television is a medium fueled by shows that consistently draw an audience. The medical shows. The procedurals. The multi-cam sitcoms. These are the things that make television profitable. You can only make these things when you create an environment that invites them in and supports the idea that being a consistent draw is more important than getting a headline in Variety or winning an Emmy.

Netflix has been long overdue for this kind of change and we should welcome it because, as the streaming trendsetter, what’s good for Netflix is good for everyone else. And, if everyone else goes this way, we may finally get some consistently reliable shows out of these companies.

Apple Reportedly to Launch Subscription Service Bundles – DTH

Bloomberg’s Mark Gurman reports that Apple will launch subscription service bundles in October, ByteDance and Reliance Platforms in talks about a TikTok investment in India, and Square tests short-term loans in the Cash app.

MP3

Please SUBSCRIBE HERE.

You can support Daily Tech Headlines directly here.

A special thanks to all our supporters–without you, none of this would be possible.

Big thanks to Dan Lueders for the theme music.

Big thanks to Mustafa A. from thepolarcat.com for the logo!

Thanks to our mods, Kylde, Jack_Shid, KAPT_Kipper, and scottierowland on the subreddit

Send us email to feedback@dailytechnewsshow.com

Show Notes
To read the show notes in a separate page click here!

Today in Tech History – August 14, 2018

1888 – Mr. George Gouraud introduced the Edison phonograph to London in a press conference, including the playing of a piano and cornet recording of Sullivan’s “The Lost Chord,” one of the first recordings of music ever made.

https://www.gsarchive.net/sullivan/html/historic.html

1894 – The first wireless transmission of information using Morse code was demonstrated by Oliver Lodge during a meeting of the British Association at Oxford. A message was transmitted about 50 meters from the old Clarendon Laboratory to the lecture theater of the University Museum.

https://books.google.com/books?isbn=0262313421

1940 – John Atanasoff finished a paper describing the Atanasoff Berry Computer, or ABC, the computer he designed with Clifford Berry to solve simultaneous linear equations.

http://www.computerhistory.org/tdih/august/14/

1989 – Sega launched the Genesis console in the US. It had been released in Japan the previous October as the ‘Mega Drive.’

http://www.usgamer.net/articles/the-true-16-bit-experience-segas-genesis-turns-25

Read Tom’s science fiction and other fiction books at Merritt’s Books site.

Cordkillers 229 – So Darn Comfortable Con

Hot new SDCC trailers, Shonda Rhimes on Netflix, and broadcasting alerts on Spotify? All this and more on Cordkillers! 

Download audio

Download video

CordKillers: Ep. 229 – So Darn Comfortable Con
Recorded: July 23 2018
Guest: None

Intro Video

Primary Target

How to Watch

  • DC’s streaming service will be a one-stop shop for its TV shows, movies, and comics
    – DC announced at San Diego Comic-Con that its streaming service DC Universe will launch this fall as a hub for all things DC, with content, comics, an encyclopedia, and a social platform for fans. DC Universe will cost $7.99 a month or $74.99 for an annual subscription. Subscribers who preorder will get an additional three months for free. DC will bring five original shows to the platform in conjunction with Warner Brothers, in addition to existing live-action and animated works.

What to Watch

  • Buffy the Vampire Slayer Is Getting Rebooted, With an Emphasis on Diversity
    – Joss Whedon will executive produce a reboot of Buffy the Vampire Slayer with a new, young, diverse cast.
    Monica Owusu-Breen, whose previous writing credits include Alias, Charmed, and Agents of SHIELD, will write.
  • Netflix and Shonda Rhimes reveal eight exclusive series in the works
    – Netflix announced 8 shows in development with Shonda Rhimes.
    – Alleged con artist Anna Delvey
    – Adaptation of the 2010 book The Warmth of Other Suns detailing the flight of African-Americans north from 1916-1970
    – Adaptation of Kleiner Perkins’ Ellen Pao’s memoir Reset: My Fight for Inclusion and Lasting Change
    – Based on Julia Quinn’s Regency England feminist romance series.
    – Pick & Sepulveda, set in Mexican California in the 1840s.
    – Adaptation of The Residence: Inside the Private World of the White House.
    – Sunshine Scouts – half hour comedy series about teenage girls at a sleepaway camp who survive the apocalypse.
    – Hot Chocolate Nutcracker documentary of Debbie Allen Dance Academy’s award-winning reimagining of the classic ballet.
  • Netflix announces its first Mark Millar titles
    – Netflix announced the first titles from Millarworld:
    Jupiter’s Legacy:
    An original series about Golden Age superheroes having kids…and those kids becoming angsty millennials.
    Empress:
    An original film about a space Empress on the run.
    Huck:
    This movie wonders if the greatest super power is just all the friends we made along the way.
    Sharkey:
    Adapted from an upcoming comic, a film about a bounty hunter. In space. Named Sharkey.
    American Jesus:
    A comic-turned-Spanish-language TV show about a boy who may or may not be the second coming of Jesus.
  • Amazon Orders Sci-Fi Series ‘Tales From the Loop’
    – Amazon has given a series order to “Tales From the Loop,” a science fiction drama from “Legion” writer Nathaniel Halpern, based on the art of Simon Stålenhag, whose paintings blend elements of futuristic science fiction with images of rural life in Sweden.
  • Hulu’s Mars drama ‘The First’ debuts September 14th
    – Hulu’s “The First,” starring Natascha McElhone and Sean Penn and developed by Beau Willimon, premieres September 14. It follows the first human Mars mission.
  • The Spider-Man: Into the Spider-Verse Comic-Con Footage Was Absolutely Amazing
    – Sony showed but didn’t release a trailer for Spider-Man: Into the Spider-Verse, which hits theaters December 18. (Features Miles Morales, Gwen Stacy, Peter Parker AND more spideys from other universes like Spider-Ham (voiced by John Mulaney), Peni Parker (Kimiko Glenn), and Spider-Man Noir)

What We’re Watching

Front Lines

  • Disney fires ‘Guardians of the Galaxy’ director James Gunn over ‘indefensible’ old tweets
    – Disney cut ties with James Gunn and he will not be directing Guardians of the Galaxy 3. A series of old tweets from Gunn referencing pedophilia and rape resurfaced online this week. Other tweets, which have since been deleted, included satire about 9/11, AIDS and the Holocaust. On Thursday Gunn wrote, “Many people who have followed my career know when I started, I viewed myself as a provocateur, making movies and telling jokes that were outrageous and taboo. As I have discussed publicly many times, as I’ve developed as a person, so has my work and my humor.”
  • Senate wants emergency alerts to go out through Netflix, Spotify, etc
    – Senators in Hawaii and South Dakota introduced a bill (the “Reliable Emergency Alert Distribution Improvement,” or READI, act) that would “explore” broadcasting alerts to “online streaming services, such as Netflix and Spotify.”
  • Survey: 5.4 Million Americans Will Cut The Cable TV Cord In 2018
    – Management consulting firm cg42 is the latest to put out a study that says cord-cutting is on the rise. cg42 projects 5.4 million more people will cut the cord in 2018 in the US for a total of 18.8 million cord-cutters. The study surveyed customers and cites frustrations with the lack of reasonable rates, getting nickel-and-dimed with fees, and new customers getting better deals than existing ones.
  • Comcast concedes to Disney in bidding war for Fox assets
    – Comcast withdrew its offer to purchase most of 21st Century Fox, leaving Disney in position to acquire everything except the broadcasting network, Fox News, Fox Business, FS1, FS2 and the Big Ten Network, which will be spun off into their own company. Disney also previously agreed with regulators to sell off the Fox Sports Regional Networks it will acquire as part of the deal. Meanwhile Comcast will focus on acquiring Sky, which is 39% owned by Fox.
  • Netflix redesigns its TV interface with new navigation, full-screen trailers
    – Netflix is rolling out a redesign to its TV-based apps over the next few months. A ribbon menu on the left side will now contain Search, My List, and separate sections for Movies and Series as well as a section called New.
  • Walmart is reportedly building a video streaming service to take on Netflix
    – Sources tell The Information that Walmart is considering offering a streaming video service for $8 a month, matching Netflix’s cheapest plan and less than Prime Video’s standalone price. Walmart currently offers free, ad-supported streaming video through its Vudu service.

Dispatches from the Front
Hola gents (and lady guest?),
I’d like to thank you guys for a number of show alerts, mostly courtesy of Bryce. Not everything in his wheelhouse is my flavor, but he seems to find serials early that we haven’t seen and enjoy.
If you guys haven’t talked about Letterkenny yet, you should go watch it. The first two short seasons are available on Hulu. It’s Canadian dry humor full of puns and stereotypical characters. It doesn’t waste time with backstory we don’t care about and just rapid-fires the funny.
Give it a shot and see what you think.
Keep cutting them cords, fellas,
Dan and Emily


Hello to all – one thing I’ve been thinking about a lot is a way to watch shows without having to have a month-to-month membership with the different services, considering that most services allow you to watch their whole catalog. For example, I would pay Netflix in January and catch up on all the shows during that month, then cancel it. In February I pay for Hulu and watch the first two seasons of Handmaid’s, catch a few other shows, then cancel it. HBO in March, CBS All Access in April, etc.
Or what about an AI like you guys talk about that just gives you the algorithm (a la traveling salesman) that computes the best course to take to hit the most shows you want to watch while paying the least per month?

Love to hear your thoughts.

Arturo


Hey Tom, Brian, and guest,

It seems like everyone compares Netflix and HBO to each other, but I don’t think that’s the right approach anymore. It seems that Netflix no longer wants to be HBO; they want to be your entire cable package. They now have original programming that targets so many niches they cover most of the major basic cable channels. Because of this, I don’t think Netflix cares about people criticizing their garbage programming – it’s essentially just a channel you’d never tune in to!

Keep killing those cords,
Andy


About the listener who wrote in last week about wanting to have in-progress, “themed streams” – I think he is on to something. For horror fans, the streaming service Shudder broadcasts a constant stream of its content on rotation, dubbed “Shudder TV”. In fact, you don’t even have to be a subscriber to access it. If you do subscribe, however, you can switch between several sub-genre “channels” of Shudder’s content. Because it comes up as soon as you start the app, I’ve found myself getting interested in movies I haven’t seen which I may not have otherwise chosen. I can always pull up the on-demand version in the app to get the beginning later. It’s an easy way to discover new content without searching through titles, summaries and trailers.

Love the show – keep up the great work.
Tim


What I think Netflix (and perhaps other streaming services) needs is an “I feel lucky” button that will just “pick something” that its algorithm “knows” you’ll like based on your watching history, etc. Don’t like what it picks? Hit next and it could pick something else for you.

Or at least that’s what *I* would want! I’m not keen on dropping in, in the middle of a movie, TV show, etc. That’s one reason I watch everything via streaming, DVR, etc.

Later!
Michael


I subscribe to some movie trailer channels on YouTube. I frequently see movies I would LIKE to watch later, but I am not aware of a service that’ll let me “tag” movies that haven’t even hit theaters yet that I’d like to see whenever they’re available (especially on streaming). Do you know of such a service?

Thanks!
Michael

Links

2018 Summer Movie Draft
patreon.com/cordkillers