These are very good scans, too. The kind of scans that a person looking to find a long-lost article, verify a hard-to-grab fact, or pass a great image along to others would kill to have. 600 dots per inch, excellent contrast and clarity, and the margins cut just right.
So, I could fill this entry with all the nice covers, but covers are kind of easy, to be frank. You put them face down on the scanner, you do a nice big image, and then touch it up a tad. The cover paper and the printing are always super-quality compared to the rest, so it’ll look good:
But the INSIDE stuff… that’s so much harder. Magazines were often bound in a way that put the images RIGHT against the binding, not every magazine did the proper spacing, and all of it is very hard to shove into a scanner without losing some information. I have a lot of well-meaning scans in my life with a lot of information missing.
But these… these are primo.
When I stumbled on the Patreon, he had three patrons giving him $10 a month. I’d like it to be $500, or $1000. I want this to be his full-time job.
Reading the Patreon page’s description of his process shows he’s taking it quite seriously. Steaming glue, removing staples. I’ve gone on record about the pros and cons of destructive scanning, but game magazines are not rare, just entirely unrepresented in scanned items compared to how many people have these things in their past.
I read something like this:
It is extremely unlikely that I will profit from your pledge any time soon. My scanner alone was over $4,000 and the scanning software was $600. Because I’m working with a high volume of high resolution 600 DPI images I purchased several hard drives including a CalDigit T4 20TB RAID array for $2,000. I have also spent several thousand dollars on the magazines themselves, which become more expensive as they become rarer. This is in addition to the cost of my computer, monitor, and other things which go into the creation of these scans. It may sound like I’m rich but really I’m just motivated, working two jobs and pursuing large projects.
…and all I think about is, this guy is doing so much amazing work that so many thousands could be benefiting from, and they should throw a few bucks at him for his time.
My work consists of carefully removing individual pages from magazines with a heat gun or staple-remover so that the entire page may be scanned. Occasionally I will use a stack paper cutter where appropriate and will not involve loss of page content. I will then scan the pages in my large format ADF scanner into 600 DPI uncompressed TIFFs. From there I either upload 300 DPI JPEGs for others to edit and release on various sites or I will edit them myself and store the 600 DPI versions in backup hard disks. I also take photos of magazines still factory-sealed to document their newsstand appearance. I also rip full ISOs of magazine coverdiscs and make scans of coverdisc sleeves on a color-corrected flatbed scanner and upload those to archive.org as well.
This is the sort of thing I can really get behind.
The Internet Archive is scanning stuff, to be sure, but the focus is on books. Magazines are much, much harder to scan – the book scanners in use are just not as easy to use with something bound like magazines are. The work that Mark is doing is stuff that very few others are doing, and to have canonical scans of the advertisements, writing and materials from magazines that used to populate the shelves is vital.
Some time ago, I gave my collection of donated game-related magazines to the Museum of Art and Digital Entertainment, because I recognized I couldn’t be scanning them anytime soon, and how difficult the job was going to be. It would take real, major labor I couldn’t personally give.
Well, here it is. He’s been at it for a year. I’d like to see that monthly number jump to $100/month, $500/month, or more. People dropping $5/month towards this Patreon would be doing a lot for this particular body of knowledge.
The previous entry got the attention it needed, and the maintainers of the VLC project connected with both Emularity developers and Emscripten developers and the process has begun.
The best example of where we are is this screenshot:
With the Emularity project, this was something like 2-3 months into the project. In this case, it happened in 3 days.
The reasons it took such a short time were multi-fold. First, the VLC maintainers jumped right into it at full-bore. They’ve had to architect VLC for a variety of wide-ranging platforms including OS X, Windows, Android, and even weirdos like OS/2; to have something aimed at “web” is just another place to go. (They’d also made a few web plugins in the past.) Second, the developers of Emularity and Emscripten were right there to answer the tough questions, the weird little bumps and switchbacks.
Finally, everybody has been super-energetic about it – diving into the idea, without getting hung up on factors or features or what may emerge; the same flexibility that coding gives the world means that the final item will be something that can be refined and improved.
So that’s great news. But after the initial request went into a lot of screens, a wave of demands and questions came along, and I thought I’d answer some of them to the best of my abilities, and also make some observations as well.
When you suggest something somewhat crazy, especially in the programming or development world, there’s a wide variance in the responses. And if you end up on Hacker News, Reddit, or a number of other high-traffic locations, those reactions fall into some very predictable areas:
This would be great if it happens
This is fundamentally terrible, let me talk about why for 4 paragraphs
You are talking about making a sword. I am a swordmaker. I have many opinions.
My sister was attacked by a C library and I’m going to tell a very long story
Oh man, Jason Scott, this guy
So, quickly on some of these:
It’s understandable some people will want to throw the whole idea under the bus because the idea of the Web Browser playing a major part in transactions is a theoretical hellscape compared to an ideal infrastructure, but that’s what we have and here we go.
Browsers do some of this heavy lifting, but it depends on the browser, on the platform, on the day, and they do not talk to each other. If there were a way to include a framework to tell a browser what to do with ‘stuff’, and then it brought both the stuff and the instructions in and did the work, great. Yes, there are plenty of cases of stuff/instructions (Webpage/HTML, Audio/MP3) that browsers take in, but it’s different everywhere.
But let’s shift over to why I think this is important, and why I chose VLC to interact with.
First, VLC is one of those things that people love, or people wish there was something better than, but VLC is what we have. It’s flexible, it’s been well-maintained, and it has been singularly focused. For a very long time, the goal of the project has been aimed at turning both static files AND streams into something you can see on your machine. And the machine you can see it on is pretty much every machine capable of making audio and video work.
Fundamentally, VLC is a bucket into which you can drop a very large variance of sound-oriented or visual-oriented files and containers, and it will do something with them. DVD ISO files become playable DVDs, including all the features of said DVDs. VCDs become craptastic but playable discs. MP3, FLAC, MIDI – all of them fall into VLC and start becoming scrubbing-ready sound experiences. There are quibbles here and there about accuracy of reproduction (especially with older MOD-like formats like S3M or .XM), but these are code, and fixable in code. That VLC doesn’t immediately barf on the rug with the amount of crapola that can be thrown at it is enormous.
And completing this thought, by choosing something like VLC, with its top-down open source condition and universal approach, the “closing of the loop” from VLC being available in all browsers instantly will ideally cause people to find the time to improve and add formats that otherwise wouldn’t experience such advocacy. Images into Apple II floppy disk image? Oscilloscope captures? Morse code evaluation? Slow Scan Television? If those items have a future, it’s probably in VLC and it’s much more likely if the web uses a VLC that just appears in the browser, no fuss or muss.
Fundamentally, I think my personal motivations are pretty transparent and clear. I help oversee a petabytes-big pile of data at the Internet Archive. A lot of it is very accessible; even more of it is not, or has to have clever “derivations” pulled out of it for access. You can listen to .FLACs that have been uploaded, for example, because we derive (noted) mp3 versions that go through the web easier. Same for the MPG files that become .mp4s and so on, and so on. A VLC that (optionally) can play off the originals, or which can access formats that currently sit as huge lumps in our archives, will be a fundamental world changer.
Imagine playing DVDs right there, in the browser. Or really old computer formats. Or doing a bunch of simple operations to incoming video and audio to improve it without having to make a pile of slight variations of the originals to stream. VLC.js will do this and do it very well. The millions of files that are currently without any status in the archive will join the millions that do have easy playability. Old or obscure ideas will rejoin the conversation. Forgotten aspects will return. And VLC itself, faced with such a large test sample, will get better at replaying these items in the process.
This is why this is being done. This is why I believe in it so strongly.
I don’t know what roadblocks or technical decisions the team has ahead of it, but they’re working very hard at it, and some sort of prototype seems imminent. The world with this happening will change slightly when it starts working. But as it refines, and as these secondary aspects begin, it will change even more. VLC will change. Maybe even browsers will change.
Access drives preservation. And that’s what’s driving this.
See you on the noisy and image-filled other side.
Permanent link to this article: https://www.internetking.us/wordpress/2016/11/17/a-simple-explanation-vlc-js/
I mean, it cost a dozen people hundreds of hours of their lives… and there were tears, rage, crisis, drama, and broken hearts and feelings… but it did happen, and the elation and the world we live in now is quite amazing, with instantaneous emulated programs in the browser. And it’s gotten boring for people who know about it, except when they haven’t heard about it until now.
By the way: work continues earnestly on what was called JSMESS and is now called The Emularity. We’re doing experiments with putting it in WebAssembly and refining a bunch of UI concerns and generally making it better, faster, cooler with each iteration. Get involved – come to #jsmess on EFNet or contact me with questions.
In celebration of the five years, I’d like to suggest a new project, one of several candidates I’ve weighed, but the one I think has the best ratio of effort to absolute game-changing potential in the world.
Hey, come back!
A quick glance at the features list of VLC shows how many variant formats it handles, from audio and sound files through to encapsulations like DVD and VCDs. Files that now rest as hunks of ISOs and .ZIP files that could be turned into living, participatory parts of the online conversation. Also, formats like .MOD and .XM (trust me) would live again effectively.
Also, VLC has weathered years and years of existence, and the additional use case would help people contribute to it, much like there have been some improvements in MAME/MESS over time as folks who normally didn’t dip in there added suggestions or feedback to make the project better in pretty obscure realms.
I firmly believe that this project, fundamentally, would change the relationship of audio/video to the web.
I’ll write more about this in coming months, I’m sure, but if you’re interested, stop by #vlcjs on EFnet, or ping me on twitter at @textfiles, or write to me at email@example.com with your thoughts and feedback.
Permanent link to this article: https://www.internetking.us/wordpress/2016/11/01/a-simple-request-vlc-js/
In 2009, Josh Miller was walking through the Timonium Hamboree and Computer Festival in Baltimore, Maryland. Among the booths of equipment, sales, and demonstrations, he found a vendor was selling an old collection of 3.5″ floppy disks for DOS and Windows. He bought it, and kept it.
A few years later, he asked me if I wanted them, and I said sure, and he mailed them to me. They fell into the everlasting Project Pile, and waited for my focus and attention.
They looked like this:
I was particularly interested in the floppies that appeared to be someone’s compilation of DOS and Windows programs in the most straightforward form possible – custom laser-printed directories on the labels, and no obvious theme as to why this shareware existed on them. They looked like this, separated out:
There were other floppies in the collection, as well:
They’d sat around for a few years while I worked on other things, but the time finally came this week to spend some effort to extract data.
There are debates on how to do this that are both boring and infuriating, and I’ve ended friendships over them, so let me just say that I used a USB 3.5″ floppy drive (still available for cheap on Amazon; please take advantage of that) and a program called WinImage that will pull out a disk image in the form of a .ima file from the floppy drive. Yes, I could do a flux imaging of these disks, but sorry, that’s incredibly insane overkill. These disks contain files put on there by a person, and we want those files, along with the accurate creation dates, the filenames, and the contents. WinImage does it.
Sometimes, the floppies have some errors and require trying over to get the data off them. Sometimes it takes a LOT of tries. If after a mass of tries I am unable to do a full disk scan into a disk image, I try just mounting it as A: in Windows and pulling the files off – they sometimes are just fine but other parts of the disk are dead. I make this a .ZIP file instead of a .IMA file. This is not preferred, but the data gets off in some form.
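The retry-then-fallback routine above can be sketched in a few lines. This is a hedged illustration of the idea, not the actual workflow (the name `read_with_retries`, the chunk size, and the retry count are all invented for the example; the real imaging was done with WinImage):

```python
def read_with_retries(path, chunk_size=512, max_tries=10):
    """Try to pull a full image off a drive, one sector-sized chunk at
    a time, retrying each chunk.  Returns the image bytes, or None if
    a chunk still fails after max_tries -- the cue to give up on a
    .IMA and fall back to copying individual files into a .ZIP.
    """
    chunks = []
    offset = 0
    with open(path, "rb") as f:
        while True:
            chunk = None
            for _attempt in range(max_tries):
                try:
                    f.seek(offset)
                    chunk = f.read(chunk_size)
                    break
                except OSError:
                    continue  # bad read; try this chunk again
            if chunk is None:
                return None  # this sector never read cleanly
            if chunk == b"":
                return b"".join(chunks)  # end of disk
            chunks.append(chunk)
            offset += len(chunk)
```

A real run would point `path` at the raw device and write the returned bytes out as the .ima file.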
Some of them (just a handful) were not even up for this – they’re sitting in a small plastic bin and I’ll try some other methods in the future. The ratio of Imaged-ZIPped-Dead was very good, like 40-3-3.
I dumped most of the imaged files (along with the ZIPs) into this item.
This is a useful item if you, yourself, want to download about 100 disk image files and “do stuff” with them. My estimation is that all of you can be transported from the first floor to the penthouse of a skyscraper with 4 elevator trips. Maybe 3. But there you go, folks. They’re dropped there and waiting for you. Internet Archive even has a link that means “give me everything at once”. It’s actually not that big at all, of course – about 260 megabytes, less than half of a standard CD-ROM.
I could do this all day. It’s really easy. It’s also something most people could do, and I would hope that people sitting on top of 3.5” floppies from DOS or Windows machines would be up for paying the money for that cheap USB drive and something like WinImage and keep making disk images of these, labeling them as best they can.
I think we can do better, though.
The Archive is running the Emularity, which includes a way to run EM-DOSBOX, which can not only play DOS programs but even play Windows 3.11 programs as well.
Therefore, it’s potentially possible for many of these programs, especially ones particularly suited as stand-alone “applications”, to be turned into in-your-browser experiences to try them out. As long as you’re willing to go through them and get them prepped for emulation.
Which I did.
The Festival Floppies collection is over 500 programs pulled from these floppies that were imaged earlier this week. The only thing they have in common was that they were sitting in a box on a vendor table in Baltimore in 2009, and I thought in a glance they might run and possibly be worth trying out. After I thought this (using a script to present them for consideration), the script did all the work of extracting the files off the original floppy images, putting the programs into an Internet Archive item, and then running a “screen shotgun” I devised with a lot of help a few years back that plays the emulations, takes the “good shots” and makes them part of a slideshow so you can get a rough idea of what you’re looking at.
You either like the DOS/Windows aesthetic, or you do not. I can’t really argue with you over whatever direction you go – it’s both ugly and brilliant, simple and complex, dated and futuristic. A lot of it depended on the authors and where their sensibilities lay. I will say that once things started moving to Windows, a bunch of things took on a somewhat bland sameness due to the “system calls” for setting up a window, making it clickable, and so on. Sometimes a brave and hearty soul would jazz things up, but they got rarer indeed. On the other hand, we didn’t have 1,000 hobbyist and professional programs re-inventing the wheel, spokes, horse, buggy, stick shift and gumball machine each time, either.
Just browsing over the images, you probably can see cases where someone put real work into the whole endeavor – if they seem to be nicely arranged words, or have a particular flair with the graphics, you might be able to figure which ones have the actual programming flow and be useful as well. Maybe not a direct indicator, but certainly a flag. It depends on how much you want to crate-dig through these things.
Let’s keep going.
Using a “word cloud” script that showed up as part of an open source package, I rewrote it into something I call a “DOS Cloud”. It goes through these archives of shareware, finds all the textfiles in the .ZIP that came along for the ride (think README.TXT, READ.ME, FILEID.DIZ and so on) and then runs to see what the most frequent one and two word phrases are. This ends up being super informative, or not informative at all, but it’s automatic, and I like automatic. Some examples:
Certainly in the last case, those words are much more informative than the name D4W20 (which actually stands for “Destroyer for Windows Version 2.0”), and so the machine won the day. I’ve called this “bored intern” level before and I’d say it’s still true – the intern may be bored, but they never stop doing the process, either. I’m sure there’s some nascent class discussion here, but I’ll say that I don’t entirely think this is work for human beings anymore. It’s just more and more algorithms at this point. Reviews and contextual summaries not discernible from analysis of graphics and text are human work.
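The “DOS Cloud” pass is simple enough to sketch. Everything here – the function name, the list of text-file extensions, and the tokenizing – is an assumption for illustration, not the original script:

```python
import re
import zipfile
from collections import Counter

def dos_cloud(zip_path, top=10):
    """Pull every text file out of a shareware .ZIP (README.TXT,
    READ.ME, FILE_ID.DIZ and friends) and count the most frequent
    one- and two-word phrases, word-cloud style.
    """
    # Guessed extension list; the real script may differ.
    text_exts = (".txt", ".me", ".diz", ".doc", ".1st")
    words = []
    with zipfile.ZipFile(zip_path) as zf:
        for name in zf.namelist():
            if name.lower().endswith(text_exts):
                # Old DOS text is close enough to latin-1 to decode losslessly.
                text = zf.read(name).decode("latin-1", errors="replace")
                words.extend(re.findall(r"[a-z']+", text.lower()))
    unigrams = Counter(words)
    bigrams = Counter(zip(words, words[1:]))
    return unigrams.most_common(top), bigrams.most_common(top)
```

Run over a directory of extracted shareware ZIPs, the top phrases per archive become the automatic, bored-intern metadata described above.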
I and others will no doubt write more and more complicated methods for extracting or providing metadata for these items, and work I’m doing in other realms goes along with this nicely. At some point, the entries for each program will have a complication and depth that rivals most anything written about the subjects at the time, when they were the state of the art in computing experience. I know that time is coming, and it will be near-automatic (or heavily machine-assisted) and it will allow these legions of nearly-lost programs to live again as easily as a few mouse clicks.
But then what?
But Then What is rapidly becoming the greatest percentage of my consideration and thought, far beyond the relatively tiny hurdles we now face in terms of emulation and presentation. It’s just math now with a lot of what’s left (making things look/work better on phones, speeding up the browser interactions, adding support for disk swapping or printer output or other aspects of what made a computer experience lasting to its original users). Math, while difficult, has a way of outing its problems over time. Energy yields results. Processing yields processing.
No, I want to know what’s going to happen beyond this situation, when the phones and browsers can play old everything pretty accurately, enough that you’d “get it” to any reasonable degree playing around with it.
Where do we go from there? What’s going to happen now? This is where I’m kind of floating these days, and there are ridiculously scant answers. It becomes very “journey of the mind” as you shake the trees and only nuts come out.
To be sure, there’s a sliver of interest in what could be called “old games” or “retrogaming” or “remixes/reissues” and so on. It’s pretty much only games, it’s pretty much roughly 100 titles, and it’s stuff that has seeped enough into pop culture or whose parent companies still make enough bank that a profit motive serves to ensure the “IP” will continue to thrive, in some way.
The Gold Fucking Standard is Nintendo, who have successfully moved into such a radical space of “protecting their IP” that they’ve really started wrecking some of the past – people who make “fan remixes” might be up for debate as to whether they should do something with old Nintendo stuff, but laying out threats for people recording how they experienced the games, and for any recording of the games for any purpose… and sending legal threats at anyone and everyone even slightly referencing their old stuff, as a core function… well, I’m just saying perhaps ol’ Nintendo isn’t doing itself any favors, but on the other hand they can apparently be the most history-distorting dicks in this space quadrant and people still buy the new games in boatloads. So let’s just set aside the Gold Fucking Standard for a bit when discussing this situation. Nobody even comes close.
There’s other companies sort of taking this hard-line approach: “Atari”, Sega, Capcom, Blizzard… but again, these are game companies markedly defending specific games that in many cases they end up making money on. In some situations, it’s only one or two games they care about and I’m not entirely convinced they even remember they made some of the others. They certainly don’t issue gameplay video takedowns and on the whole, historic overview of the companies thrives in the world.
But what a small keyhole of software history these games are! There’s entire other experiences related to software that are both available, and perhaps even of interest to someone who never saw this stuff the first time around. But that’s kind of an educated guess on my part. I could be entirely wrong on this. I’d like to find out!
Pursuing this line of thought has sent me hurtling into What are even museums and what are even public spaces and all sorts of more general questions that I have extracted various answers for, and which it turns out are kind of turmoil-y. It has also informed me that nobody completely knows, but holy shit do people without managerial authority have ideas about it. Reeling it over to the online experience of this offline-debated environment solves some problems (10,000 people can look at something with the same closeness and all the time in the world to regard it) and adds others (roving packs of shitty consultant companies doing rough searches on a pocket list of “protected materials” and then sending out form letters towards anything that even roughly matches it, and calling it a ($800) day).
Luckily, I happen to work for an institution that is big on experiments and giving me a laughably long leash, and so the experiment of instant online emulated computer experience lives in a real way and can allow millions of people (it’s been millions, believe it or not) to instantly experience those digital historical items every second of every day.
So even though I don’t have the answers, at all, I am happy that the unanswered portions of the Big Questions haven’t stopped people from deriving a little joy, a little wonder, a little connection to this realm of human creation.
In September of 2016, a talented programmer released his own cooked update to a major company’s legacy operating system, purely because it needed to be done. A raft of new features, wrap-in programs, and bugfixes were included in this release, which I stress was done as a hobby project.
The project is understatement itself, simply called ProDOS 2.4. It updates ProDOS, the last version of which, 2.0.3, was released in 1993.
You can download it, or boot it in an emulator on the webpage, here.
As an update unto itself, this item is a wonder – compatibility has been repaired for the entire Apple II line, from the first Apple II through to the Apple IIgs, as well as for various versions of the 6502 CPU (like the 65C02) and for cases where newer cards have been installed in Apple IIs for USB-connected/emulated drives. Important utilities related to disk transfer, disk inspection, and program selection have joined the image. The footprint is smaller, and it runs faster than its predecessor (a wonder in any case of OS upgrades).
The reason I call this the most important operating system update of the year is multi-fold.
First, the pure unique experience of a 23-year gap between upgrades means that you can see a rare example of what happens when a computer environment just sits tight for decades, with many eyes on it and many notes about how the experience can be improved, followed by someone driven enough to go through methodically and implement all those requests. The inclusion of the utilities on the disk means we also have the benefit of all the after-market improvements in functionality that the continuing users of the environment needed, all time-tested, and all wrapped in without disturbing the size of the operating system itself. It’s like a gold-star hall of fame of Apple II utilities packed into the OS they were inspired by.
This choreographed waltz of new and old is unique in itself.
Next is that this is an operating system upgrade free of commercial and marketing constraints and drives. Compared with, say, an iOS upgrade that trumpets the addition of a search function or blares out a proud announcement that they broke maps because Google kissed another boy at recess. Or Windows 10, the 1968 Democratic Convention Riot of Operating Systems, which was designed from the ground up to be compatible with a variety of mobile/tablet products that are on the way out, and which were shoved down the throats of current users with a cajoling, insulting methodology with misleading opt-out routes and freakier and freakier fake-countdowns.
The current mainstream OS environment is, frankly, horrifying, and to see a pure note, a trumpet of clear-minded attention to efficiency, functionality and improvement, stands in testament to the fact that it is still possible to achieve this, albeit a smaller, slower-moving target. Either way, it’s an inspiration.
Last of all, this upgrade is a valentine not just to the community who makes use of this platform, but to the ideas of hacker improvement calling back decades before 1993. The number of people this upgrade benefits is relatively small in the world – the number of folks still using Apple IIs is tiny enough that nearly everybody doing so either knows each other, or knows someone who knows everyone else. It is not a route to fame, or a resume point to get snapped up by a start-up, or a game of one-upmanship shoddily slapped together to prove a point or drop a “beta” onto the end as a fig leaf against what could best be called a lab experiment gone off in the fridge. It is done for the sake of what it is – a tool that has been polished and made anew, so the near-extinct audience for it works to the best of their ability with a machine that, itself, is thought of as the last mass-marketed computer designed by a single individual.
That’s a very special day indeed, and I doubt the remainder of 2016 will top it, any more than I think the first 9 months have.
Thanks to John Brooks for the inspiration this release provides.
Permanent link to this article: https://www.internetking.us/wordpress/2016/09/15/why-the-apple-ii-prodos-2-4-release-is-the-os-news-of-the-year/
People often ask me if there’s a way they can help. I think I have something.
So, the Internet Archive has had a wild hit on its hand with the Hip Hop Mixtapes collection, which I’ve been culling from multiple sources and then shoving into the Archive’s drives through a series of increasingly complicated scripts. When I run my set of scripts, they do a good job of yanking the latest and greatest from a selection of sources, doing all the cleanup work, verifying the new mixtapes aren’t already in the collection, and then uploading them. From there, the Archive’s processes do the work, and then we have ourselves the latest tapes available to the world.
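The “verify the new mixtapes aren’t already in the collection” step can be sketched as a checksum comparison. This is an illustrative guess at the approach, not the actual scripts; `md5_of`, `new_uploads`, and the choice of MD5 are all assumptions:

```python
import hashlib

def md5_of(path, chunk_size=1 << 20):
    """Checksum a file in chunks so large mixtape archives don't eat RAM."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def new_uploads(candidate_paths, known_md5s):
    """Return only the candidates whose checksum isn't already in the
    collection's known set -- the 'verify it isn't a duplicate' pass."""
    return [p for p in candidate_paths if md5_of(p) not in known_md5s]
```

In a pipeline like the one described, the known-checksum set would be pulled from the existing collection’s metadata before each run, and only the survivors would move on to the upload stage.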
Since I see some of these tapes get thousands of listens within hours of being added, I know this is something people want. So, it’s a success all around.
With success, of course, comes the two flipside factors: My own interest in seeing the collection improved and expanded, and the complaints from people who know about this subject finding shortcomings in every little thing.
There is a grand complaint that this collection currently focuses on mixtapes from 2000 onwards (and really, 2005 onwards). Guilty. That’s what’s easiest to find. Let’s set that one aside for a moment, as I’ve got several endeavors to improve that.
What I need help with is that there are a mass of mixtapes that quickly fell off the radar in terms of being easily downloadable and I need someone to spend time grabbing them for the collection.
While impressive, the 8,000 tapes up on the Archive are actually the ones that were able to be grabbed by scripts without any hangups, like the tapes falling out of favor or the sites offering them going down. If you use the global list I have, the total number of tapes could be as high as 20,000.
Again, it’s a shame that a lot of pre-2000 mixtapes haven’t yet fallen into my lap, but it’s really a shame that mixtapes that existed, were uploaded to the internet, and were readily available just a couple years ago have faded down into obscurity. I’d like someone (or a coordinated group of someones) to help grab those disparate and at-risk mixtapes to get them into the collection.
I have information on all these missing tapes – the song titles, the artist information, and even information on mp3 size and what was in the original package. I’ve gone out there and tried to do this work, and I can do it, but it’s not a good use of my time – I have a lot of things I have to do and dedicating my efforts in this particular direction means a lot of other items will suffer.
So I’m reaching out to you. Hit me up at firstname.lastname@example.org and help me build a set of people who are grabbing this body of work before it falls into darkness.
Permanent link to this article: https://www.internetking.us/wordpress/2016/09/07/whos-going-to-be-the-hip-hop-hero/
For over a decade, net neutrality has been a topic of heated discussion in the technology world. Put plainly, net neutrality means that everybody can access the same Internet in the same way, regardless of one’s Internet Service Provider or what they are accessing. Today, Comcast doesn’t really care if a user is accessing a U.S. government site, reading anarchist literature, gaming, watching YouTube or farting around Facebook. Losing net neutrality would mean that service providers would be able to change what can be accessed and how on the Internet.
One of the single most distinctive and important aspects of the Internet is how anybody can do anything on it; even a punk kid living in his mother’s basement eating Hot Pockets can make millions with the right idea. What makes the Internet such an incredible force in today’s world is how accessible it is to anybody of any status. The homeless and destitute can get online at locations all over the place and be equal to everybody else – write essays that are instantly available worldwide, watch lectures on any subject under the sun or play chess against the best of the best. The Internet is a great equalizing factor, and the benefit it has provided to humanity will probably never be fully appreciated.
On April 23, the FCC announced that it would put forth new rules to let companies pay ISPs for faster data rates – what has been described as a “fast lane” for the Internet. What this means is that the richest corporations in the country can make their content more easily accessed, and the Internet ceases to be an equalizer and becomes yet another facet of our corporatist economy. The raw amount of competition on the Internet is amazing. The number of social networks, games, video streaming sites and news outlets is astronomical, and the only way one succeeds is by being a better product.
The end of net neutrality changes everything. When a small business has an idea and Google has an idea, Google has the millions to pay ISPs to make its idea work faster while the small business is considering which brand of instant ramen is cheaper.

The Internet is an amazing source of alternative news media. Every day, stories not seen on TV – stories that for whatever reason go uncovered by the major networks – are shared with millions of Americans. The new FCC rules will allow the major networks to make their sites run faster, gaining more viewers, increasing ad revenue and outcompeting alternative media outlets. Small businesses already have a hard enough time surviving and competing in the economy we have now, with millions of pages of regulations, a skyrocketing minimum wage and millions of dollars of subsidies going to politically connected corporations.

The principle that all content on the Internet is equal gave everybody equal footing and equal opportunity, and that era appears to be at an end. Is it just a coincidence that the current head of the FCC, Thomas Wheeler, worked as a lobbyist for cable companies and has been inducted into the Cable Television Hall of Fame? This also gives our government the ability to hinder access to materials it doesn’t agree with – and with its track record, how can it be trusted?
April 20, 2019, Update: Firefighters got an earlier start today with ignitions on the Pingree Hill portion of the Elkhorn-Pingree Hill Prescribed Burn. Expect smoke to be visible in the Poudre Canyon and along Highway 14. Yesterday firefighters successfully burned 150 acres. The Elkhorn-Pingree Hill Prescribed Burn is located in the Poudre Canyon. […]
See the 'Announcements' and 'News' Tabs for the latest information on planned prescribed burns. For detailed information regarding prescribed fires on the Payette, visit our on-line Southwest Idaho Interagency Fuels Treatments Map. The Payette National Forest will be conducting multiple prescribed fires this spring. Depending on weather conditions, burns could take place anytime from April to […]
PLEASE USE CAUTION AND SLOW DOWN WHEN DRIVING THROUGH PRESCRIBED BURN AREAS. Work is being carried out in and around parked fire equipment. Public and firefighter safety remain our #1 priority. Hazardous Fuel Reduction projects are scheduled this winter/spring on three mountaintops within the Kern River Ranger District on the Sequoia National Forest, as well as various […]
In an effort to reduce the risk from high-severity wildfire, fire managers on the San Juan National Forest are increasing the use of prescribed fire. In 2019, we anticipate treating up to 15,000 acres with prescribed fire. Fire can be effectively and efficiently used to reduce fire hazard, and gain ecological and other management benefits. […]
The U.S. Forest Service will be conducting prescribed burns to maintain, restore or improve early successional habitat, maintain wetlands, restore unique barrens ecosystems, and regenerate oak and hickory. Prescribed burns also reduce fuel loads, thereby lowering the risk of wildfire impacts. Due to the overly moist and cold conditions last fall, many planned burns had […]
As winter conditions settle in across Colorado's Northern Front Range, the Arapaho and Roosevelt National Forests will work to burn slash piles resulting from fuels reduction and hazardous tree removal projects across the area. Hand piles are a result of using chainsaws to thin the forest. Much of the smaller cut material is piled for […]
As warmer and drier weather takes hold in Eastern Oregon, the Wallowa-Whitman National Forest is ready to begin spring prescribed burning to reduce hazardous fuels on up to 10,000 acres of the National Forest. National Forest managers have been successfully conducting prescribed burning operations for fuel reduction for more than 35 years, and in the […]
Crews are continuing to hold, monitor and staff the project. No new ignitions are currently scheduled, but prescribed fire already on the ground continues to consume ground litter, such as duff, leaves and needles, and may continue to send smoke into the air from time to time. 440 acres have been treated so far. This […]
SEDONA, Ariz. - The Red Rock Ranger District fire managers are planning a broadcast burn April 9 to 11 southwest of Stoneman Lake. The cold front forecast for northern Arizona on April 9 to 11 is expected to bring winds strong enough, and from the correct direction, to fulfill the prescription needed for this burn.
This spring Monongahela National Forest officials plan to conduct a prescribed burn on 833 acres of national forest land in the Big Mountain area, west and southwest of Cherry Grove in Pendleton County. The Forest Service burned here in 2018 and will conduct a series of burns in this area for the next several years. […]
This spring Monongahela National Forest officials plan to conduct a prescribed burn on 491 acres of national forest land in the Brushy Mountain project area, near Mapledale in Greenbrier County. Why do we burn? To restore historic fire regimes, improve wildlife habitat, enhance forest structure and age diversity, improve oak regeneration, control tree diseases and insects, and reduce hazardous fuel levels. How […]
Grants, NM – The National Park Service at El Malpais National Monument and the Bureau of Land Management’s El Malpais National Conservation Area plan to conduct a prescribed fire on the west side of the monument near the junction of County Road 42 and the Big Tubes turn-off. The Rendija prescribed fire is approximately 1,500 […]
This spring Monongahela National Forest officials plan to conduct a prescribed burn on 1,055 acres of national forest land in the Ramshorn project area, about five miles southeast of Green Bank in Pocahontas County. Why do we burn? Reintroducing fire will: restore historic fire regimes, improve wildlife habitat, enhance forest structure and age diversity, improve oak regeneration, control tree […]
This spring Monongahela National Forest officials plan to conduct a prescribed burn on 411 acres of national forest land in the Middle Mountain project area, south of Huntersville in Pocahontas County. Why do we burn? Reintroducing fire into the forest will: restore historic fire regimes, improve wildlife habitat, enhance forest structure and age diversity, improve oak regeneration, control tree […]
This spring Monongahela National Forest officials plan to conduct a prescribed burn on 9 acres of national forest land at Cheat Summit Fort, west of Huttonsville in Randolph County. Why do we burn? This 9-acre prescribed burn will help to control grasses and other vegetation, and enhance interpretive opportunities at historic earthworks that date from […]
The rainstorm of Feb. 14, 2019, caused extensive damage to Forest Service roads across the national forest, which covers the San Bernardino and San Jacinto mountains, as well as the eastern slice of the San Gabriels. Damage was also caused to permitted state routes maintained by Caltrans. An incident response team has been created […]
See the 'Announcements' and 'News' Tabs for the latest information on planned prescribed burns. As temperatures drop and the first precipitation of the season arrives, the Shasta-Trinity National Forest is conducting its fall, winter and spring prescribed fire projects. “The 2018 fire season has provided significant firefighting challenges for the Shasta-Trinity National Forest, our partners […]
Forest Service to Conduct Controlled Burn for Grindstone Mountain in Augusta County Today. Location: This 830-acre burn will take place on Grindstone Mountain, located in Augusta County near Stokesville, VA. Date and Time: The Forest Service will begin ignitions on March 20, 2019. This controlled burn will be completed over 2-3 days. Firefighters will […]
The Bureau of Land Management’s Gila District and Safford Field Office will black line and prepare the burn area boundaries for the approximately 15,726 acre South Rim prescribed burn in the Aravaipa Canyon area. The work will occur this Wednesday-Friday. Fire crews will monitor the burned area during and after the work to ensure public […]
Prescribed Fire: Promoting Fire-Adapted Communities and Creating Resilient Landscapes McCall, ID – While the Payette National Forest will be conducting multiple prescribed fires this spring, an on-line map is available for the public to see details of all burns throughout Southwest Idaho. For many years, land management agencies have been producing a hard copy of […]
EMERGENCY CLOSURE ORDER OF THE FOREST SUPERVISOR RESTRICTING OCCUPANCY AND USE, TO WIT: HOOSIER NATIONAL FOREST, TELL CITY RANGER DISTRICT, INDIANA Under the authority of the Act of Congress dated June 4, 1897, as amended (16 U.S.C. 551), and pursuant to the Secretary of Agriculture’s Regulations set forth as 36 CFR Part 261, Subpart B […]
The prescribed burn that will be attempted today, weather and conditions dependent, is called Jeffries. This will be on the east side of SR 66 south of Sulphur and across the highway from Oriole Lake Recreation Area. It is in Perry County and will be 745 acres.
Dolores, Colo., April 11, 2019 – The Dolores Ranger District, San Juan National Forest is planning to conduct several prescribed burns starting in mid-May. Burning operations will take place over multiple days when weather and fuel conditions are favorable. Both hand and aerial ignition methods may be utilized, and operations may continue throughout the summer and into fall […]
Two prescribed fires are planned for today in Perry County, weather and conditions dependent. West Celina - 452 acres. This burn will affect the Two Lakes Trail System with trail and road closures. Clover Lick Talley - 1,120 acres. This burn will affect the Mogan Ridge Trail System with trail and road closures.
PAGOSA SPRINGS, Colo., April 8, 2019 –The Pagosa Ranger District in San Juan National Forest is planning to conduct several prescribed burns near Pagosa Springs. Burning operations will take place over several days beginning mid-April. Work may continue through June depending on weather and fuel conditions. Burn units are located in the following areas northwest […]
SEDONA, Ariz., April 8, 2019, For Immediate Release — Coconino National Forest firefighters are planning to take advantage of favorable weather conditions this week for prescribed burn operations, beginning as early as Tuesday. This week’s broadcast prescribed burns will target up to 1,800 acres on the Red Rock Ranger District and an additional 300 acres […]
Fire staff successfully burned 667 acres of the Big Mountain prescribed burn yesterday, Thursday, April 4. The National Weather Service has predicted rain for this area today and fire staff are on-site patrolling for hot spots. On another note, three wildfires were reported on the Monongahela yesterday. All were small (less than 15 acres) and […]
Fire staff successfully completed the Brushy Mountain Grouse Management Area prescribed burn yesterday, Thursday, April 4. The National Weather Service has predicted rain for this area today and fire staff are on-site patrolling for hot spots. On another note, three wildfires were reported on the Monongahela yesterday. All were small (less than 15 acres) and contained […]
Blackline operations on the Big Mountain Prescribed Burn were successfully completed yesterday, April 2. Fire staff planned to continue burning today, April 3, but the National Weather Service has issued a Red Flag Warning for critical fire weather conditions through this afternoon due to strong winds, low relative humidity and low fuel moisture. Fire staff […]
Forty-four acres of prescribed burning were safely accomplished yesterday, April 2. Fire staff planned to continue burning today, April 3, but the National Weather Service has issued a Special Weather Statement for increased fire danger through this afternoon due to high winds and low relative humidity values. Winds may diminish by dark and if they […]
There is a prescribed burn planned in Jackson County for Wednesday, April 3rd on the Hoosier National Forest. Implementation of this burn is dependent on weather and current conditions in order to proceed. The Hoosier NF plans to burn 658 acres in Fork Ridge today, Wednesday, April 3rd. The burn is located in Jackson County […]
Some blacklining was accomplished today (April 2). If conditions are favorable, blacklining will continue tomorrow (April 3). However, some weather forecasts predict high winds to develop tomorrow. Fire staff will monitor weather/wind predictions and delay additional blacklining until conditions are favorable. Check back for the latest information and updates.
Several units were successfully burned today (April 2). These units will be monitored overnight. If weather conditions are favorable, burning will continue tomorrow (April 3). However, some weather forecasts predict high winds to develop tomorrow. Fire staff will monitor weather/wind predictions and delay additional burning until conditions are favorable. Check back for the latest information […]