[#] Thu Mar 25 2021 17:16:00 MST from noreply@blogger.com (Donovan Colbert) <>

Subject: Resolving local host names with SAMBA on Amiga with MiamiDX


The release of the popular Coffin Amiga OS image for the V4, and its subsequent migration to MiSTer, has caused more interest in using SAMBA (SMB) on the Amiga to move files from machine to machine. SAMBA on the Amiga has some limitations: the server you are connecting to must support SMB v1, and modern Windows machines require SMB v2 or later by default. It is easy to set up your MiSTer as an SMB v1 file server for other Amigas or a V4, and many NAS devices support SMB v1 by default or can easily be configured to do so.
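(On the server side, enabling SMB v1 usually comes down to a couple of lines in the Samba configuration. The fragment below is just an illustrative sketch for a generic Linux Samba install - the share name and path are made up, and your MiSTer image or NAS may manage this through its own menus instead:)

```
; /etc/samba/smb.conf -- illustrative fragment, not a complete config
[global]
   ; "NT1" is Samba's name for the SMB v1 dialect
   server min protocol = NT1
   server max protocol = NT1

[amiga]
   ; a simple guest-accessible share the Amiga can mount
   path = /media/fat
   read only = no
   guest ok = yes
```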

A recurring issue I hear about is that SAMBA on someone's FPGA Amiga will not resolve SAMBA file servers they know are on their network, on the same subnet as their SAMBA-capable Amiga. If you're using Coffin and MiamiDX (which you should be; Roadshow is priced too high and AmiTCP has its own set of issues), the fix is pretty simple. Credit for reminding me about this goes to Eric Gustafson.

In Coffin, MiamiDX tends to run iconified and hidden beneath the task bar. The easiest way to bring it up, in my experience, is to go to Network on the top task bar in Coffin and select "Online".

MiamiDX should appear. In the menu on the left, select "Database".

The Database window will default to "DNSservers".

Change the pulldown menu from "DNSservers" to "hosts" and click the Add button.


The fields will become un-ghosted. Enter the IP address of your SMB server, the server name, and an alias. Hit Enter and the host entry will be saved.
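(An entry in the hosts database follows the same shape as a classic Unix /etc/hosts line - IP address, then the server name, then an alias. The values below are made-up examples for a MiSTer on a typical home LAN; whatever name and alias you enter are the names SAMBA will then resolve:)

```
192.168.1.50   mister   smb1
```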

With MiamiDX still the active application, go to the Workbench menu bar at the top of the screen and select "Settings/Save" to save your updated configuration.

Your SMB host should now resolve by name. 


http://donovancolbert.blogspot.com/2021/03/resolving-local-host-names-with-samba.html

[#] Tue May 18 2021 20:14:00 MST from noreply@blogger.com (Donovan Colbert) <>

Subject: Reddit Conversation on CRT/LCDs and Refresh Frequencies for Retro Gaming


My gut instinct is to recommend a Sony PVM - basically a studio monitor. They're *expensive* and rare - and there are a lot of true multiscan monitors, like those from NEC, Mitsubishi and a handful of others, that have the same basic advantage of being true *scanning* multi-frequency monitors (as opposed to fixed-frequency multisync monitors, which...)

Ok. One thing that seems lost in the sands of time - that I'll explain now. Back in the day there were basically three kinds of monitors. A fixed-frequency monitor could do ONE frequency (sometimes two, in the case of some Commodore models - but each frequency through a different cable) - but only that specific frequency. It was a purpose-built monitor designed for a single computer or a single video card. MDA, CGA, EGA and VGA displays were all generally single-frequency, and they generally supported one resolution. As SVGA became a standard - cards with 1MB of memory driving 640x480, 800x600 and 1024x768 at various color depths - multisync monitors became super desirable. These came in two varieties. A fixed-frequency multisync could generally autodetect the display frequency and match it against a list of *fixed* frequencies within its range. The other type was a multiscan monitor - it could scan the incoming frequency and match it anywhere across a continuous analog range.
So in example one... if your video output was at frequencies A, B, F, Q and T (I'm making this part up...) and your multisync supported frequencies A, B, Q and S...

Modes F and T would be unavailable, and S would be useless, as your video card wasn't outputting at that frequency anyhow.

But if you bought a Multiscan that could do everything between A and Z... not only would it be able to match all those modes... but if you had a mode that was, say, A.1 or S.02 - close to a standard but just a little bit off, the multiscan would adjust to that weird frequency - too.
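The difference is easy to sketch in a few lines of Python - the frequency values here are invented placeholders standing in for the letters above, not real monitor specs. A multisync checks the signal against a discrete list; a multiscan accepts anything inside a continuous range.

```python
# Illustrative sketch of fixed-frequency multisync vs. true multiscan.
# All frequency values are invented placeholders (kHz horizontal rates).

MULTISYNC_SUPPORTED = [15.7, 24.0, 31.5, 35.5]  # discrete list of fixed frequencies

def multisync_can_display(freq_khz, tolerance=0.1):
    """A fixed-frequency multisync only locks onto a signal matching
    one of its supported frequencies (within a small tolerance)."""
    return any(abs(freq_khz - f) <= tolerance for f in MULTISYNC_SUPPORTED)

def multiscan_can_display(freq_khz, lo=15.0, hi=38.0):
    """A true multiscan locks onto anything inside a continuous analog range."""
    return lo <= freq_khz <= hi

print(multisync_can_display(31.5))  # a standard rate: the multisync matches it
print(multisync_can_display(31.8))  # "close to a standard but a little off": rejected
print(multiscan_can_display(31.8))  # the multiscan just adjusts to it
```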
So of course Multi-scans were way more rare, and FAR more expensive. And of course, the terminology is so close and it hasn't been important in so long - that it is real confusing today to figure out what you want... which is why people go with the Sony PVMs. They're a true multiscan and they have a huge range of frequencies they can detect and adjust to - and so... they're the least hassle if you're going this route.

NOW... with all that said... here is the caveat. I mentioned I have a Sony CPD-1302. It is a multisync (not a multiscan) RGB CRT with a standard 15-pin VGA input that supports a specific range of common frequencies - from, I think, around 13 kHz up to around 75 kHz. It was from the era when 486 computers were popular, SVGA was big, and 1024x768 at 16.7 million colors was considered a *very* cutting-edge display.

At those resolutions, another CRT factor called "dot pitch" became very important. There is a shadow mask (Sony's Trinitron displays used an aperture grille with vertical slots instead) - a sheet of metal with holes punched in it that the red, green and blue guns in the CRT tube are aligned to shoot through, to "fire" the phosphors on the tube and light them up. Because the holes were round, there were tiny little fine "dead spots" (put four circles next to each other in a grid, and there is a little diamond of dead space between them). The tinier the holes, and the better aligned the three guns firing through those holes, the less dead space and the crisper the image.

Kind of like this. That diamond in the middle - that is *part* of where you get your "visible scan line" - the black dead lines on a TV or Monitor. 


So... by the time we got to multifrequency high resolution CRTs - the standard was a .28mm hole (or dot pitch) - and Sony had a .25mm dot pitch (smaller is better here...)

And honestly, that resulted in a display almost as crisp as a modern LCD, with hardly any visible scan lines.

Older systems like the C64, the Amiga, CGA and EGA - were so low resolution by comparison that they could get away with .35, .38, even .48 dot pitch CRTs.
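A rough back-of-the-envelope calculation shows why those numbers mattered. This little Python sketch (assuming nominal screen size, 4:3 aspect, and the pitches mentioned above; real tubes had smaller viewable areas, so treat the results as upper bounds) counts how many phosphor triads fit across the width of a tube:

```python
import math

# Back-of-the-envelope: how many phosphor triads fit across a tube of a given
# nominal size and dot pitch? That count is a rough upper bound on how many
# horizontal pixels the tube can resolve.

def max_horizontal_triads(diagonal_in, dot_pitch_mm, aspect=(4, 3)):
    """Rough upper bound on horizontal pixels a tube can resolve."""
    w, h = aspect
    width_mm = diagonal_in * (w / math.hypot(w, h)) * 25.4  # screen width in mm
    return int(width_mm / dot_pitch_mm)

print(max_horizontal_triads(14, 0.28))  # ~1016: 1024x768 sits right at the limit
print(max_horizontal_triads(14, 0.25))  # ~1137: a bit of headroom at the same size
print(max_horizontal_triads(14, 0.48))  # ~592: still plenty for a 320x200 machine
```

At .28mm, a nominal 14" tube has just about enough triads for 1024 horizontal pixels - while even a coarse .48mm tube has plenty for a 320x200-era home computer.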

That is what causes "pixel bleed". The guns weren't very precise, they were poorly aligned, and there was lots of dead space in huge visible scan lines - so you could offset pixel colors and *mix* one row of pixels with the one next to it to get a completely different *visible* color, and the blurriness of the whole thing actually made the image look BETTER and less "digital"... Modern LCDs can't do this - and a really high-quality CRT with a fine dot pitch won't get the effect quite right, either.

Super fine pitch is GREAT for text. LCD precision is GREAT for text. But the blurry inaccuracy of older monitors allowed digital artists to do tricks that *simply won't work on a crisp display*.
And simulated scan lines don't fix that. It is a physical property of the design. To get the best genuine pixel bleed you actually need a low quality CRT with a big dot pitch.

And TVs back then, way before anyone talked about 480i, 720p, HD and all of that... weren't designed to display text at all - not even 40-column text at 320x200 or lower resolutions. So... the TV you played your console on had a HUGE dot pitch, super poorly aligned guns, and giant visible scan lines when you hooked up your Sega or SNES to it.

One side effect of this was that as we moved to HD, we started noticing things like the makeup covering the zits on a newscaster, and other flaws that old TVs *masked*. Sets looked cheaper - you could tell it was peeling wallpaper, not real brick, in Monica's apartment on Friends. The blurriness covered up a lot, and it allowed anyone using CRTs as a medium to exploit the inherent flaws of the technology to do things that weren't possible once those flaws were "fixed".
So... the bottom line is - the *most* accurate display you're going to get is by matching genuine hardware with a genuine monitor that was an appropriate match at the time when that hardware was new. Your FPGA SNES will probably never look quite right to you unless you can figure out how to output it to a woodgrain 27" RCA console TV like the one in your living room in 1990 when you were 11. Even on the Sony studio monitor.

I run MiSTer and MiST on my CPD-1302 for Minimig - and it is close - but really, it is too crisp. It would look *much* more authentic to me on a Magnavox 1084S with a far less fine dot pitch. The Sony CRT looks MORE like an LCD than like a period-appropriate CRT for an Amiga 500/2000.

TL;DR - for a purist, a Sony PVM is the closest you're probably going to get - but even that won't be perfect. ;)

*Whew*. I should cut, copy, paste, and save this somewhere so I can post it whenever this question comes up.

Again, I'm not an *expert* on all of this. I'm sure if I got any important details wrong - someone who is will jump in and tell us how wrong I am - because this is the Internet. ;)


http://donovancolbert.blogspot.com/2021/05/reddit-conversation-on-crtlcds-and.html

[#] Sun Jul 18 2021 09:13:00 MST from noreply@blogger.com (Donovan Colbert) <>

Subject: The History of the CFRP Game


Like many gamers of my generation, my introduction to the world of Advanced Dungeons and Dragons happened at my local GEMCO. The genre of High Fantasy had been "a thing" among the generation immediately before mine - embraced by Led Zeppelin and an army of stoners scrawling "Frodo Lives!" graffiti on their college campuses for at least a decade - but TSR and GEMCO were about to bring epic High Fantasy into the mainstream consciousness.

Previously, nothing quite like AD&D existed in the mainstream consciousness. Medieval themes were relegated to princesses in towers and valiant Knights on brave white steeds dispatching dragons with a lance. Thundarr the Barbarian was possibly the closest thing my generation had been exposed to - but this was more of a Gamma World/Planet of the Apes post-apocalyptic genre thing. That was a genre that was huge from the late 60s to the late 70s - at which point the Star Wars Space Opera was replacing it as the favorite franchise trope in media and entertainment. 



My original encounter with the 1st-generation AD&D Monster Manual, as a 9-year-old, was an amazing event. I was an avid reader, and when I opened the pages of this book in the toy section, my mind was blown. I didn't even realize it was part of the rules for a game - I just knew I held a magical tome, an encyclopedia of a mind-blowing assortment of monsters with their stats and vivid descriptions. It lit my imagination on fire.




Around this same time, a new and emerging type of entertainment was starting to explode. The video game, both the "arcade console," and the "home console," (and to a lesser extent, among the very fortunate, the home computer,) were creating the first generation of children spending hours at a time planted in front of a screen blowing their allowance a quarter at a time to experience virtual realities.   

Due largely to limitations in the technology of the time, and the parallel rise of Star Wars as the Biggest Thing In The World, the vast majority of those games had science fiction themes. You could take control of a spacecraft and avoid asteroids, defend your base or the Earth from invading aliens, or go where no man had gone before, all day long.

What you couldn't do was enter a dark dungeon teeming with orcs and goblins and beholders in search of plunder and treasure. 

The desire for this type of game was palpable among my friends and me. Occasionally there would be an arcade title like "Warrior" - a vector-graphics game where two knights battled with swords, gladiator-style, near open pits. For some reason these titles were rare and mechanically unreliable - frequently out of order. Much as depicted in Toy Story, the Cowboy was on his way out and the Space Ranger was on his way in - and somewhere in between, overlooked, was the band of sell-sword mercenaries looking to make a fortune while also becoming legends of the realm by vanquishing some great evil menace (and maybe robbing some graves in the process).

The guys writing the games had all toked many bongs while listening to The Misty Mountains and reading Lord of the Rings - but their bosses, the guys in suits who approved projects - they were from a different generation. Their vision of Middle Ages fantasy adventure was firmly this: 

These were not the droids we were looking for

If you ask people about the history of computer fantasy adventure games, you'll get a lot of disagreement on what the "first" computer fantasy adventure/RPG games were. Rogue and Adventure tend to be popular choices. The problem with these earliest attempts is that they were almost exclusively played on college mainframes by students and faculty who were already a niche group involved in "pen and paper" gaming too. There was no wide, mainstream consumer access to these titles - for the most part, only those in academia had the opportunity to experience them. The games were crude and primitive, and did not have massive commercial appeal. In my mind - they don't count.

This was what you got, and you were happy with it.


One early obstacle to these types of adventure games was that you couldn't just pick up the game and learn the rules quickly by trial and error. In a world of games like Space Invaders, Asteroids and Pac-Man, adventure games required you to read and comprehend some fairly complex rule sets in order to understand the goal and how to succeed. There were no in-game tutorials, and you couldn't just hop online and watch a Let's Play or read a walkthrough.

I think the earliest game that brought the framework of what would develop into FRP and adventure games was actually 1979's Atari 2600 "Superman" cartridge. Significantly, this was a one-player game, with a quest completed by collecting items randomly scattered across a number of different "rooms" or "scenes" depicting the cities and subways of Metropolis. There were enemies and obstacles that could set you back, there were side quests (saving Lois Lane from Lex Luthor), and you could change identities. Fundamentally, Skyrim operates on an engine that isn't tremendously different from the engine Superman introduced in the late '70s.




Superman wins as the earliest attempt to bring an FRP game to video gaming on a technicality: it was built on the engine for a game that was started earlier but released later - a game which was arguably the first on any home console to deliver a (very primitive) experience of the high-fantasy Dungeons and Dragons FRP genre.

There are 3 castles - gold, white and black - plus yellow, red and green dragons most often described as giant ducks, a sword that looks like an arrow, and a valiant knight who is a block. There is a bat that steals things from you (and sometimes carries the dragons around), and keys. As a 10-year-old in 1980, I played Atari 2600 Adventure obsessively with my friend. Adventure is also notable as the first game to include a secret Easter egg: a magic invisible dot that revealed a hidden room with the programmer's name inside it, snuck past the "suits" at Atari, who would not credit the programmers who designed Atari games. Like Superman, it offered hours of play and multiple difficulty levels. Once mastered, it was like riding a bicycle - I can still complete level 3 without a reset after 5 or 10 minutes of warming up, over 40 years later.

Pictured: Green Duck, sword and noble hero

Here we should also mention Atari's Haunted House. Set in a multi-floor Victorian mansion infested with ghosts, winds that blow out the candle which is your only source of light, and other challenges, it was built on a similar game mechanic to both Superman and Adventure. Instead of a square, your avatar had been upgraded to a pair of eyeballs roaming in terror through the dark halls of the mansion, trying to gather and re-assemble the 3 scattered pieces of an urn.  

Possibly the most underappreciated early CFRP game came 2 years later, in 1982 when a company named "Starpath/Arcadia" released a peripheral called the Supercharger for the Atari 2600.   

This peripheral upgraded the 2600 to 6 KB of RAM (up from 128 bytes), with relatively high-resolution graphics. If hardware development cycles hadn't been so rapid and the video game crash imminent, it would have been a far more popular peripheral for the Atari 2600. It loaded games from cassette tapes - something at that point still relegated to home computer systems, which were rare and generally incredibly expensive. Commodore was about to blow the doors off that barrier to entry, but that was still a ways off.

Dragonstomper was arguably the first, most authentic and most epic fantasy role-playing video game to arrive on the market at that time. It really got the atmosphere right. The game starts on an overhead map featuring castles, temples, lakes, and forests. Monsters inhabit the countryside, often lurking near particular features. Dispatching them gains you treasure and plunder. Items can raise or lower your stats (Strength, Dexterity and Hit Points).

The monster selection was sometimes anticlimactic



The goal is to amass enough wealth, strength and treasure (including papers that allow safe passage across a bridge) to enter a village. There you can stock up on supplies and hirelings, at which point you venture on to the third segment: a dragon's cave full of traps, eventually leading to an encounter with the titular dragon you intend to stomp. While Ultima I had been available since 1981 on the Apple II, Richard Garriott's first two games were in many ways far less realized as CFRP games than Dragonstomper. The price of an Atari 2600 and a Supercharger was a fraction of the cost of a single peripheral on the home PCs of the time. Dragonstomper was a more fully executed and far more accessible early CFRP in the spirit of AD&D than either Ultima I or II. There is no time travel, there are no energy weapons, and there are no spaceships to be found in Dragonstomper. But the general design of Dragonstomper shares much with Ultima III, where Lord British found his stride with his seminal computer FRP franchise.

The elements all come together in this game in a way that console games wouldn't match until much later, when the NES arrived on US soil. By then the landscape had changed, with computer gaming becoming accessible and dominant. The NES FRP titles almost all had a distinctly Japanese anime/kawaii fairy-tale style, unlike the grittier, more dramatic and realistic style of US games. Dragonstomper was the closest the first wave of video gaming ever got to an affordable, accessible computerized fantasy role-playing game, and it would be a long time until the C-64 made this a reality for people of modest income again.

It is worth noting that around the same time Dragonstomper was released, Pitfall! arrived on the 2600 and Advanced Dungeons & Dragons became available for the Intellivision. The Intellivision was always an odd console that Mattel didn't quite know how to market, and their AD&D is an interesting game, but more like Hunt the Wumpus on the TI-99/4A than like Adventure or Dragonstomper. Pitfall! is one of the earliest "platformer" games, with running, jumping and climbing from screen to screen to collect items for a score - but it is also an early "adventure" game in that it offers a fairly open world, with different rooms and a clear quest goal that can be completed, as opposed to just playing indefinitely until you run out of lives.




Around this time, two other titles showed up in my life. Venture was an arcade game by Exidy. A little market in my neighborhood had the arcade cabinet for a short while. When it worked, it was an incredible game. Unfortunately, the hardware was notoriously fickle, and it spent most of its time out of order before the owner finally replaced it with a more reliable game. Eventually Coleco licensed the title for their ColecoVision console and I got to play the game to my heart's content. In Venture, your avatar had gone from a square or a dot to a smiley face holding a bow, chased through the hallways of a dungeon inhabited by giant troll-like monsters, entering rooms of diabolical design inhabited by a variety of dungeon-dwelling enemies including skeletons, goblins, snakes and griffons. It was an arcade game with no quest and endlessly increasing difficulty. The goal was to retrieve as much treasure as possible before the game became so difficult it consumed all of your lives.




The more interesting title on the ColecoVision was the Epyx title "Gateway to Apshai". This dungeon crawler started you off with low stats and light equipment. Exploring the maze-like dungeon revealed more and more of its shape as you moved forward, and encounters with monsters gave you treasure, score, and better equipment. At the end of each level you could upgrade your stats, improving your abilities as the monsters grew more difficult to dispatch. It was a hybrid between the arcade-scorer gameplay mechanics of Venture and the more FRP, quest-based approach - with persistent elements like an inventory - of titles like Dragonstomper. Apshai was a port of a game previously available on home PCs, but its faithful arrival on a home console announced that consoles were beginning to deliver PC-style gaming experiences on a far more affordable budget.




Around this time, just as things were getting good, the video-game crash occurred. It caused major upheaval and disruption to the video game industry, with some analysts predicting that video games were just a fad and had lived out their moment. 

They were wrong, of course. On the other side, the Commodore 64 and Nintendo NES would change the world. Older first generation gamers who had cut their teeth on domestic gaming systems tended to gravitate toward the C=64, which had become nearly as affordable as consoles had been, with software and hardware easily accessible at your local Gemco or Toys R Us instead of difficult to find and intimidating computer specialty stores or mail-order. 

A second generation of younger gamers would move on quickly from Super Mario Bros. to Legend of Zelda. There is a generational divide in gaming there that still exists to this day, with a legion of younger gamers who cut their teeth on d-pads and cute, kid-friendly Nintendo titles. There is cross-over between both generations of gamers - but as a general rule, older gamers prefer grittier, more realistic "hardcore" gaming and later gamers are far more fond of friendly, stylized gaming with an emphasis on cute, cartoon style graphics and themes. 

The affordable cost and department store availability of the Commodore 64 suddenly made more complex ports of older Atari and Apple computer FRP games available to a previously inaccessible market. Though those titles had existed for much longer, very few gamers had access to them because of the prohibitive cost of the systems they ran on. The Ultima series owes much of its status as the first fully realized commercially successful CFRP to the success of the Commodore 64. 

Shortly after the explosion of affordable home computers that followed the crash, a promising company called Electronic Arts released a seminal title that for the first time reimagined the formula that had arguably been prototyped with Dragonstomper and made the default for CFRP gaming by the Ultima series.




1985's The Bard's Tale was in some ways more limited in scope than the Ultima series, confining you inside the walls of a single city under siege by evil monsters rather than offering a world map full of cities to travel to and explore. Inside this city were various dungeons, all leading to a confrontation with the source of the evil infestation - a predictable rogue wizard. But the level of immersion in this first-person, "pseudo-3D" dungeon crawler was unprecedented. Borrowing heavily from the game mechanics of Ultima III (so much so that some versions could import your characters from Ultima III into The Bard's Tale), which in turn copied AD&D concepts and ideas, The Bard's Tale brought the CFRP genre into the mainstream. The early levels were mercilessly hard. It is my opinion that most gamers who bought the title didn't get past level 3 or 4 without rage-quitting, cheating with an editor, or finding the trick of grinding a specific monster encounter until they were high enough level to make fleeting expeditions into the first two levels of the first dungeon. It is really a wonder that The Bard's Tale didn't kill the fledgling genre just as it began to mature. EA quickly leveraged the unprecedented graphics capabilities of the Amiga, and the Amiga version of The Bard's Tale had no peers in graphics at the time. Coupled with an encounter engine that did play a lot like a tabletop session of AD&D, The Bard's Tale was an imperfect game - but it laid the foundation for a franchise that still does well today. Eventually the SSI AD&D "Gold Box" titles would appear: the first licensed AD&D CFRP titles that faithfully applied the official AD&D rules and allowed character progression from title to title in the series.

From 1979 until 1985 may not sound like a long time, but for those who wanted to experience the dungeon adventures of the AD&D FRP game but did not have access to a circle of friends smart enough to puzzle out the complex and arcane rules of the pen-and-paper game, it was an eternity of waiting for an industry to catch up with the tastes of its audience.

  



http://donovancolbert.blogspot.com/2021/07/the-history-of-cfrp-game.html

[#] Sun Jul 25 2021 21:11:00 MST from noreply@blogger.com (Donovan Colbert) <>

Subject: A Vegan, a Crossfitter and a Redditor... (BMW M4 Grill)


    ...who hates the new grill on the BMW M4/M3 all walk into a bar. I only know because they told everyone there in the first 5 minutes. 
 

The new BMW kidney grill is controversial. 

And some of the earliest concept images didn't help.


My first BMW was an e36 328i saloon, followed by an e30 325iC.

Each was initially trashed compared to its predecessor. The e30 was thought to be "too big, diluting the brand with an emphasis on luxury to attract yuppies instead of driving enthusiasts". Of course, the e30 M3 is now the benchmark by which every M since has been judged. In America, the e30's huge chrome-and-black bumpers were also resoundingly trashed - but I became kind of fond of them as a very specific, very Euro BMW design feature. Once BMW was able to go to plastic bumpers, the E30 looked a lot more like a Nissan Maxima or a Toyota.



Then the E36 was trashed for being even MORE of these things: bigger, heavier, and more homogeneous with all the other brands out there. And of course, the e36 M3 is *also* a legend in BMW M car history.



So... when I hear people complaining on Reddit that all the new BMWs "look like Kias..."

I kind of chuckle.

Maybe BMWs look like other marques so often because BMW is the design benchmark most other automakers strive toward. Maybe those OTHER cars are always trying to look like what BMW *was*, as BMW moves off in some bold direction that makes its loyal fanbase scream, "What the hell are you doing?!?"

The Kia Stinger is easy to mistake for an M3 *and* a Porsche Panamera depending on the angle you see it at and the color it is painted. You won't mistake a new M4 for *anything* else on the road.





I haven't liked everything BMW has done for the last 20 years. In that time, I owned a Z3 and an X5 - but also an Escalade, a couple of GMC trucks and a Hyundai Sonata hybrid. I watched the M3 struggle against the Audi S5, pop a V8 into a car that had always had a straight-6, and then go back to a twin turbo; I watched BMW split the 3 series into the 3 and 4 series, with the 3 being the saloons and the 4 the coupes - and then there were 4 series gran coupes that had 4 doors, and I was confused as to why that all happened. I watched as the brand went from 3 basic choices - the 3, the 5 and the 7, in two basic trim levels, an entry-level smaller engine and an upscale model with a larger engine, plus some M versions - to having everything from a 1 series to a Z and almost every number and letter in between, with seemingly all kinds of different trim packages at each level. That kind of proliferation seems like the thing that got GM in trouble, and it erased the simplicity that once made BMW very easy for a driving enthusiast to navigate. Kind of like the difference between In-N-Out and 5 Guys. So between about 2009 and 2020, BMW won two out of six car-buying decisions in my household. And WAY back, before I could afford a BMW, I owned a tuned Datsun 510 as my first car - "the poor man's 2002" - way before The Fast and the Furious had made ricers a "thing". So I'm a big fan of this kind of driving experience, that kind of balance and handling and performance. I think it is fine to be critical of some of the decisions and directions BMW seems to be headed in. They've *always* made controversial decisions, though - and frequently, in the long run, those decisions aged well. Even the 318ti has its defenders today.



Seriously though... if the 3 is the 4 door and the 4 is the 2 door, and you start making a 4 door 4 series...

Shouldn't you just get rid of the 4 series all together, call them all the 3 series again, and offer it in a 2 door, a 4 door, and a convertible - in RWD and AWD - with a single M available in those variations - like you used to do?

It is the most neurotic part of the BMW lineup - other than maybe the 2 series, which is both a RWD car that is the closest thing to the soul of the E30/E36-era M3 that BMW has made in the last 20 years or so, and a label slapped on a transverse-mounted AWD platform shared with the Mini. But I'm cautious about questioning the logic of the new M235i gran coupe, too. It might be exactly the right car for exactly the right market.

They keep saying, "The M4 is really just the coupe version of the M3..."
And I'm over here going, "No one really believes this, though - so is it REALLY? And if so, why not just call it an M3 coupe?"

And honestly... I've never driven an F-series 3 or M3, but after owning an F83 and an F23 - the M4 has grown a bit... It feels significantly *bigger* than the e36 3 series and M3.

I owned a 4.6L Mustang for a while in the '90s - and honestly, the M4 feels somewhere between the e36 M3 and the Mustang in driving characteristics. HUGE grabby tires that want to tramline in road ruts - and the big, wide tires add to the steering tech's numbness, making the steering feel damped, less responsive and sluggish. You FEEL the mass at speed in corners - like an American muscle car, its mass really *wants* to keep going straight when you tell it to turn. There is a lot of understeer. It has so much torque it feels like the back end wants to betray you when you stomp on it - and the power delivery is less linear than the old normally aspirated straight-6 from the e36. I mean... don't get me wrong, I'm not saying that an F82/F83 is ANYTHING like the driving experience of a '97 4.6L Mustang GT... (the Mustang was the best light-duty truck/tractor I've ever owned, and Ford *owns* that market), but it moves in that general direction compared to *previous* BMW M cars.

The F23 M235i though - feels very close to my expectations as a former e30/e36 owner - and comparing it to the e36 M3 is *not* unreasonable.



http://donovancolbert.blogspot.com/2021/07/a-vegan-crossfitter-and-redditor-bmw-m4.html

[#] Tue Aug 03 2021 17:40:00 MST from noreply@blogger.com (Donovan Colbert) <>

Subject: BMW M235i Android Auto MMI Upgrade Review


BMW was late to the Android Auto and Apple CarPlay game - with an uneven rollout of Android Auto starting in late 2016 models. Despite that, some of their flagship models did not get Android Auto until far later. For example, the F8x M4 (F82/F83) did not get Android Auto at all - it wasn't available until the 2021 G8x generation was released in this line. 




Some 2016 head units, on some models, could be upgraded to add CarPlay and Android Auto - many more could not. If you're in "Club Could Not" you probably already know it by now. In general, the technical details come down to this: in-car information/entertainment consoles have become almost as central a part of your vehicle's digital brain as the ECU that controls the engine and other systems. Car companies aren't very good at this, either intentionally or because it isn't their core business. It certainly helps sell new models when you simply can't get a factory head unit with the newest features at any price. 


If you can't upgrade to Android Auto on your factory unit, it is generally because BMW went with a less powerful ARM processor and less memory in the factory head unit in your car. There are solutions that will reflash your factory head unit to add Android Auto capabilities - but there is a reason BMW didn't allow this upgrade. Without enough RAM or processing power, you'll get instability and system slowdowns. Modifying your actual factory head unit with an aftermarket flash of its programming could also lead to other system-wide problems that could be difficult to rectify. 

Several companies have stepped in to fill this void with aftermarket solutions. There are two ways you can do this, and both have their pluses and minuses. One solution is a replacement for your factory LCD that has a built-in Android Auto/CarPlay system. These tend to be pricey. You're basically pulling out your factory LCD display and replacing it with an Android touchscreen tablet made to fit where the factory LCD was. Your factory head unit hooks into this LCD directly and can pass its video through to it, allowing you to maintain factory information/entertainment center functions. I also don't trust off-brand Chinese tablets to be well built, and I have my doubts that they're using high quality components in these devices. 

A generally less expensive way is to put in an "MMI" (multimedia interface) box. This is a small box that contains its own standalone ARM processor and RAM. It sits between your factory head unit, which remains unmodified, and the factory LCD, much the same way that a piggyback ECU tune box sits between the factory ECU and the engine. It too can pass the unmodified factory head unit's output through to the LCD, but your Android Auto connects to the MMI itself, which then outputs from the MMI to the LCD. The net result is very similar in either case. 

The difference is that with an Android screen upgrade you're replacing part of your factory entertainment system (the LCD) with a new device. With the MMI you're adding a device between the factory head unit and the LCD. To revert, you would simply remove the MMI, and you're back to factory. Generally speaking, if you have the 8" or larger LCD (a BMW equipped with the Navigation package and Nav iDrive controller,) the MMI makes a lot of sense. If you've got the smaller LCD (a non-nav-equipped BMW), then the Android LCD upgrade might be a better way to go. 

With that said, neither my 2020 F83 M4 nor my 2016 F23 M235i has Android Auto. The older M235i also has a much older version of iDrive (NBT 4.0). Among the many limitations this causes, I can have incoming texts dictated to me, but I have to pull over to dictate a response. Media functions involving music are also frustrating and dated. Based on this, I decided to take a chance on the less expensive car with the older system, and purchased an MMI unit and installed it. 

There are several companies that all offer this solution, at a wide range of prices from below $300 to nearly $800. Looking at their products, they all look pretty much the same. After a lot of research I found out that is because they are all the same. In fact, according to this post at bimmerpost, they are almost all just reselling a product made by a company named "Sunplus". Each company loads their own custom interface onto the hardware, but underneath, they're the exact same box. 

I'm sure Bimmertech would assure you that for $799 you're getting their world class domestic support, a network of local installation techs across the country, and a better, more polished custom interface. 

In fact, until recently, you could frequently download and flash one company's firmware onto another company's MMI unit - because the hardware was the exact same thing. I'm sure Bimmertech wasn't happy with people buying cheaper units from Chinese vendors and then flashing their firmware onto them. Now the MMI units have a vendor specific pin code and trying to cross-flash your MMI with a different vendor's firmware will brick your device. 

I decided to go with Carlinkit - a Chinese company that offers virtually the same MMI for around $270. Shipping via DHL was surprisingly fast and the MMI arrived less than a week later.  

Both companies make the same promises on features. Wired and wireless (with the right phone and Android 11 or higher) Android Auto and Apple Carplay, integration with factory steering controls, mic, and iDrive controller, access to original iDrive interface and features including parking assist and backup camera, wired and wireless screen mirroring, and other features. If there is a difference other than the vendor's firmware, I haven't been able to discover what it is. If anyone knows, please leave a message in the comments. 

The installation does involve prying trim pieces off of your dash to access your factory head unit, temporarily removing the head unit, and disconnecting the wiring from the head unit and the LCD so it can be bypassed with an additional harness that comes with the MMI. The F2x 2 series seems to have an easier dash to do this on than the 3, 4 and other BMW series, but it is still intimidating to pry pieces off your car's dash. In addition, some cars also have a pair of fiber optic cables that have to be removed from the original harness and snapped into the replacement one provided. I felt pretty comfortable with the idea of disconnecting and reconnecting the various connectors, which are similar to the Molex and other PSU connectors found in a desktop PC. I felt less confident about prying pieces of my dash off. 

I have a friend who is a huge car buff and who has a huge garage with A/C at his house. His garage easily fits a McLaren, a drag modified C7 Vette, and a classic Camaro, along with room to spare for a kitchen and entertainment area, with an RV garage adjoining. His Vette is an unrecognizable monster with a parachute on the end that does the 1/4 mile in under 9 seconds - so he has some experience pulling expensive cars apart and putting them back together. I had him help me with the installation. 

It beat my 120 degree F garage in all ways


We watched videos on YouTube describing the installation, and others showing the disassembly of the 235i. It was still a relatively challenging installation - and took us several attempts. We also broke the retaining clips that hold the fiber optics in place on both the original harness and the replacement harness. Probably on the 3rd attempt, after reassembling and then taking the dash apart again, it seemed like we had it right. It took some fiddling with the factory BT settings (disabled) and the BT and WiFi settings on the MMI (enabled) and pairing with the phone, but suddenly Android Auto was appearing on my factory LCD. We buttoned it up. Be aware, on an F2x 2 series, the space available to get the MMI and cabling back into the dash is snug. It wasn't easy getting it all shoved back in there. 




When I put it in reverse to leave I got an ominous error, "Parking Assist disabled, use caution when reversing." On the drive home, another alert appeared: "Pedestrian Collision and Stability System disabled, drive moderately and contact a service center as soon as possible." That was more concerning. Shortly after appearing, these messages disappeared, and I saw no check codes or other alerts on the factory LCD. When I got home, parking assist worked, and I tested reverse and the reverse camera was now working. I turned the car off and went to bed. 

The next morning everything was working fine until about 10 minutes into my commute, when the LCD flickered several times and then went black, with a "No Signal" error message. I parked it, and a few hours later went out, rebooted my phone, restarted the car, and it started working again. On the drive home, it died again. I waited overnight, and the next morning it was still displaying "No Signal." My alarm was also going off intermittently. 




I had contacted Carlinkit support immediately. They were quick to respond, even at odd hours, and were reassuring and generally helpful, if sometimes vague. There was also a bit of a language barrier. They wanted video and pictures of the issues I was describing - which seemed kind of pointless to me, but I provided what was asked. They suggested the issue was almost certainly related to either the video (LVDS) cable between the head unit and the MMI or between the MMI and the LCD, or to the fiber optic cables, and asked me to double check those connections. 

This is basically what the MMI looks like from any vendor. The harness cable is often referred to as the CAN cable. My kit also included two pry tools to help remove dash panels and connectors.

It is also worth noting that in the image above the LVDS cable has two ends on one side. The BMW version only has one end on each side. Additionally, you may need to change the positions of certain DIP switches depending on your model of BMW and the LCD display size in your car. 

The USB/camera cable included has connectors for wiring up aftermarket front and reverse cameras, a USB host connector, two RCA camera input connectors, and a 3.5mm IN jack and a 3.5mm Aux male plug. There is no description of what the 3.5mm jack and plug are for - and you do not need to plug the MMI unit into the factory Aux-in port for it to work, so I have no idea what function these cables serve. The USB host port allows you to hook your Android phone to the MMI via a USB cable. I understand that an advantage of the MMI host USB connector over the factory USB connector is that it will charge your phone. 

It was heading into the weekend, and my friend wasn't readily available to assist me again. Impatient, I didn't want to spend the weekend with a car with no LCD display when that is such an important hub of information on the vehicle. Finally I opened the door between my garage and the house, put a fan in the doorway, cranked the house AC to 65, and blew cold air into my garage, then disconnected the battery and disassembled the dash again myself. Having seen it done, I felt much more confident trying it myself. It was far less intimidating than I had thought. 

My friend had really done all of the installation, and while he is very good at the mechanical parts of vehicle disassembly, he doesn't have as much experience with PC-type electrical connections and some of the pitfalls and gotchas involved with them. Right away I noticed that the LVDS connector between the MMI and the LCD was backed out just a fraction, probably about 1/16th of an inch. I pushed firmly and felt the cable slide snugly into its socket with the feel of a solid connection between cable and connector. I reassembled the dash, reconnected the battery, and the LCD display was back. 

I wish I could say that resolved all my problems, but false alarms remain the sole persistent issue. I think this may happen when the car is armed and I am near the car, but far enough away that the MMI can lose signal and the phone starts jumping onto other WiFi networks. Today I armed the car for an 8 hour day in the office, disabled BT and WiFi on the phone, and did not have a single false alarm. It will be inconvenient if I have to turn BT and WiFi off and on every time I get in or out of the car - but if it fixes the issue, it is a workaround I am prepared to live with. 

The installation instructions provided by Carlinkit are very "Chinesium". They're less clear and detailed than the typical Ikea assembly instructions. The User Operation instructions are even less detailed.  

This is it. Match the numbers on the left to the numbers on the right. 

You basically get a leaflet that has instructions for the 3 different types of iDrive head units your BMW might have: CIC, NBT, or NBT EVO. Also note, you need to get the right MMI unit depending on which head unit is in your BMW. Carlinkit will help you determine that, and is very thorough about making sure you get the right MMI paired to the iDrive installed in your car before the sale. Once you've installed it, the instructions make sense, but before that, it is very unclear what goes where and how it all goes together. To get to the MMI interface, you select Aux Front from the Media menu on iDrive, and then hold down the "menu" button on your iDrive controller for 4 seconds. There is a page that describes what the various top buttons do - but they're not a 1:1 match for what you'll see on your screen. Another page describes roughly what the iDrive controller does and how to make the original BT and WiFi connection between your phone and the MMI. That is it for user instructions. You're basically on your own to figure it out beyond that. 

The Main Menu of the MMI interface



Conceptually, there are two components to what the MMI delivers. The first is the main menu, which is the MMI's configuration and settings. If you're connecting wirelessly via BT and WiFi, and have Android Auto set to auto-launch, you may only briefly see this screen, if you see it at all. Here you can select CarPlay (the Apple version of Android Auto), WireAuto (wired Android Auto), AirPlay, Autolink, Media, Settings (the MMI-specific settings), AV IN (if you have an aftermarket reverse and/or front camera), and "Exit". None of these are really documented. None of the settings inside Settings are documented. Most of them are pretty self-explanatory - but there seems to be an assumption that as an end user you're fairly technically fluent and comfortable stumbling around and figuring things out for yourself. 

There is also very little in the way of online user tutorials on these devices and their operation - on the web, on YouTube, or in the forums. Most of the information out there is for iPhone users and CarPlay. Some of that might be useful, though, as many of the key concepts are the same. 

The initial connection was a little hit and miss for me, and I found myself "rebooting" the car (turning it all the way off, starting it completely, stopping the engine but leaving the system running) and playing around with scanning my WiFi and BT connections in order to connect. I can't find much documentation on whether both BT and WiFi need to be connected - but my guess is that BT handles phone calls and music streaming while WiFi carries wireless Android Auto - so my feeling is that, "yes, both WiFi and BT need to be enabled and connected," if you want wireless Android Auto. If you want to go wired through the included USB port, I suppose you could leave WiFi disabled on your phone. 

The second component is Android Auto (or AirPlay) itself. Once you get into it, it works just like Android Auto on any other system. In the case of my unit, the factory LCD on the F23 is not a touchscreen (it is on the F83, and if the MMI were installed there, touch would work in Android Auto). This means you navigate Android Auto through the iDrive controller - and with a little trial and error, this works pretty well. It does concern me that I end up fiddling with the iDrive controller far more frequently, and manipulating it in different ways, in Android Auto than with the factory iDrive menus. Voice commands also work well - although sometimes "OK Google" doesn't register and I need to manually activate the mic from the Android Auto screen. 

I've been able to send and receive text messages and place and receive calls. On one call the other person said they were getting echo, which was not a complaint I had ever received making calls through the factory system - so noise canceling may not be as effective. Also note, your phone can be connected to Android Auto or to your factory head unit, but not both, so you can't configure it to take and make calls from iDrive while playing music and sending and receiving texts through the MMI/Android Auto at the same time. 

The music streaming and playback from my phone sounds much better through the MMI than through the factory head unit. I've seen others make this observation, too. Bass is deeper and general fidelity seems richer and fuller. I'm not sure how or why that would be - but it makes the system in the M235i sound almost as good as the one in the M4. 

And of course, now I can use multiple music apps, texting and messaging apps, and navigation apps that are all voice controlled and available at all times. I can also use screen mirroring and launch any app that will run on my Note 10+, including retro emulators, video playback, even Chrome. This requires an additional app installed on your phone, called Autolink, which is available from the Google Play store. This is also not described in the instructions. 

Overall, for around $300 I've added Android Auto to my 2016 BMW and improved the in-car entertainment and communication experience significantly. I'm still learning the ropes of navigating around and getting everything connected, and there are some persistent issues with the alarm that were probably introduced by the installation of the upgrade or by the presence of the MMI itself. I've added a non-factory device from China to my Internet-connected, location-aware, remotely controllable automobile, and from the perspective of an IT security professional there are potentially some security concerns with that. But it isn't like the NSA wasn't already hacking into the stock iDrive system anyhow - so what is one more state actor with access to my private telemetry and other data? The bottom line is there are only a few solutions to get Android Auto and AirPlay support on an older BMW, and for me this was the most economical, easiest, and most robust solution that kept the factory system as original as possible. As for the different vendors selling their own branded versions of this MMI - I really don't think it makes a difference. If Bimmertech disagrees, they can send me theirs and I'll put it in my M4 and compare it apples-to-apples with Carlinkit's version. 

I suspect that is a challenge they won't want to take me up on. 















http://donovancolbert.blogspot.com/2021/08/bmw-m235i-android-auto-mmi-upgrade.html

[#] Thu Sep 02 2021 16:18:00 MST from noreply@blogger.com (Donovan Colbert) <>

Subject: If Not For Double Standards


 So, it started with telling us that there are no biological markers for race or ethnicity. We're all one species, and our DNA is all exactly the same - there is *no* reason for our obvious physical differences in facial shape, skin pigment, hair color and texture. It is a social construct that arises from physical differences that have to do with geography and not genetics. 


Ok. 


Then they told us that there is no such thing as gender. Gender is a psychological identity and a social construct that has nothing to do with our biological genitalia. 


Hmmm... now hold on... I'm pretty sure that...


"Nope, you dislike pink because of society. Not because you have a penis. End of discussion. Misogynist!" 


The Aisle of Objectification and Oppression


Then they told us sexual preference wasn't just a binary choice, and not even 3 choices - but a *spectrum* - and that there was no such thing as actual heterosexuality. Now, I had always believed that being gay wasn't a *choice*, you were born that way. But evidently being straight *is* a choice, one that is enforced on you by a puritanical heteronormative society that hates LGBTs so much that it has reduced a rainbow of sexual preference that we should all openly enjoy into 2 main choices, straight or gay, and a third choice, "greedy and indiscriminate." Gay, Straight or Bi were not enough to accurately describe a biological female who identifies as a straight man and a biological man who identifies as a gay man who are in a relationship together as a *gay* relationship. One where the biological female who identifies as a male can be a pregnant man carrying the biological child of her and her partner created through regular sexual intercourse. If you had trouble following that - I understand. They are a gay, pansexual couple. I'm pretty sure. I'd have to check the flowchart. 


Not sure where they fit in, but there is a place on the spectrum, for sure.


At this point - even a lot of LGBTs and Liberals started going - "Guys... I think maybe this is getting out of hand..." 


They got called homophobes and told they were just programmed by the straight, gay-hating patriarchy. 


But then - this happened. 


We started telling people that women, PoCs and LGBTs needed better and more representation in media and entertainment. It got to the point where *every* romantic couple on AMC's "The Walking Dead" was either interracial, LGBT, or LGBT and interracial. Group identities that together make up around 10-15% of the population became 80% of the population in the fictional, ideal zombie-post-apocalyptic society rebuilding itself from the ashes of the collapsed patriarchy. Straight white heteronormative nuclear traditional families will be GONE after the virus wipes out humanity. This show was at one time one of the most popular shows in the country, a phenomenon - and it featured a diverse and compelling cast of characters from the beginning. Once it became a platform exclusively for LGBT diversity, ratings fell off a cliff. But more than that, the "representation" didn't reflect actual societal breakdowns at all. Minorities and LGBTs can't just be better represented - they need to be *over*-represented in order to "even things out". This is that "equity not equality" thing that has been going around. It isn't enough to give everyone a box so they can see the infield over the fence - you've got to take the box AWAY from the tallest and give the shortest MORE boxes than anyone else. Equity, not equality, right? 




But, little girls, even if they're biologically little boys, should not only have the freedom to define their own gender identity and not have it imposed on them by their homophobic, racist parents - they should have characters they identify with. So should child PoCs. Never mind that they already do - that there is a whole universe of fiction written about powerful black women, powerful gay men, powerful heroes and adventurers and lead characters who are not straight white males. For some reason, those characters never enjoy the fame and popularity of straight white male lead roles. 



But that doesn't mean they weren't bad-asses



So the only solution is to take those straight white male characters... 


And destroy them. 



Now, traditionally, all heroes have weaknesses, liabilities and vulnerabilities alongside strengths, advantages and special abilities. It is what makes characters complex and relatable. At one time the most powerful being in the Marvel COMICS Universe, not to be confused with the Marvel Cinematic Universe, was Jean Grey, Phoenix - a female. But Marvel always painted cautionary tales about being "the most powerful" in the past. It was fraught with risk and temptation and pressure - with hubris that could turn good intentions into evil outcomes, and with the destruction of personal identity as the character became consumed with the manifest power. That is a pretty interesting thing. And several of the most powerful characters in the Marvel print universe were female - and they were *loved* by readers. 

Being the Most Powerful Being in the Universe is manageable for her, though.


But that wasn't enough. The most well known heroes in popular culture, known to all - remained almost exclusively white males. The "Big 4" - Superman, Batman, Spiderman and Wonder Wo... oh... well, she was always super sexualized and objectified and she was white, too... so... you know, whatever. And she is part of the top 4. Everyone knows there is only Win, Place and Show. 4th place doesn't count anyhow... 


But even once you get down to the top heroes that aren't as widely known but are still flagship franchises... you get Iron Man, the Hulk, Thor, Captain America. Wolverine is a Johnny-come-lately and we'll ignore him and other creations that happened outside the Kirby golden era of comics. Add in every other character in fantasy escapist literature - and you have Luke Skywalker, Han Solo, James Bond... Of course, you've got to ignore the Sarah Connors and Ellen Ripleys and Leeloo Dallases of the fantasy/fiction genre with this narrative. But again, whatever. The majority of hero archetypes in literature the whole way back to early Greek epics were... ok... the Greek myths had some pretty bad-ass female goddesses too... but... for the most part, white males have dominated hero archetypes since the start. That is the story, and they're sticking to it. 


Many of you don't know that alcoholic Tony Stark lost it all in the 80s, and a black guy took over for him then, too. It was a compelling story line - but it wasn't done ham-fisted and obviously.


So. It isn't enough just to create or have popular female fictional characters. They have to make the TOP heroes, the most powerful heroes, the most POPULAR box-office heroes - females, LGBTs and PoCs. 


So how do you do that? Well, you do something called a retcon. You change the "canon" of the hero. The easiest way to do that, and it is something we've seen over and over again, is to make the original white male hero ultimately a failure. Burned out, bitter, unsuccessful in his personal life, he retires to an apartment where he lives alone as a recluse, starts eating too much, and neglects his role as hero. Spider-Man does this in Spider-Man: Into the Spider-Verse. His salvation is passing off his mantle to an extra-dimensional black teen who becomes the *real* Spider-Man in his universe when the former one dies. 


But we saw Luke turned into a hermit living on an island on a remote uncharted planet, molesting space cows for their blue milk. Bitter and jaded, with no desire to be involved in the affairs of the galaxy. We saw Han Solo reverted into a perpetual Peter-Pan-complex-driven scoundrel, who cheated on Leia, became an absentee parent, and returned to a life of intergalactic crime with a giant walking carpet as his co-pilot. Seems like after his assistance with the Rebellion, that would be a pretty easy description to recognize for someone who wants to be a *smuggling space pirate*... but again, whatever. Thor? Lost his mojo, became fat, drinks, plays video games, and hangs out with loser friends. The Hulk? Wait... isn't he green? Not as Bill Bixby... so he becomes a medley of both characters and turns into an insufferable baizuo virtue-signaling academic muscle head. Even Captain America gives it up and hangs up the shield, having learned that there are others who can wield it more effectively than he can. Iron Man simply dies. 



And then someone else picks up the mantle. Someone else - a female - is worthy to lift Mjolnir. Captain America's shield will be wielded by a black man. Not sure what will happen to the Hulk - he is still HALF green, after all.


And those that take on the legacy - are *superior* at it. They're more powerful, they're more honorable, they're less imperfect and flawed.


So remember - the reason given for not just cultivating powerful female, PoC and LGBT characters made from whole cloth - but for appropriating already established characters to transform them *into* female, PoC and LGBT characters - is so that females, PoCs and LGBT people have popular characters that they can identify with. 


But wait... we've destroyed the legacy of *all* of the straight, white male heroes. What characters will straight white little boys identify with? 


Well, evidently they don't *need* those characters to model the ideal of what they should aspire to be. I guess they can resign themselves to the idea that *they* are destined to fail and become bitter and disappointed in their inability to achieve their goals - or accept that the only way they can have value to society is by empowering everyone who *isn't* a straight white male. That is how bitter, broken-down Peter Parker redeems himself in Into the Spider-Verse - and it is also how Haymitch Abernathy redeems himself in The Hunger Games, and how Luke Skywalker redeems himself in the Disney Star Wars trilogy. They pass on what they know to a woman or PoC who is a better person than they were, and then they stand aside once that job is done. 

Get out of the way, flyboy, and let someone competent make the decisions.


Of course, just like for decades, little girls and PoCs were actually able to identify with and relate to the white male heroes, to enjoy and aspire to be like Batman, Spiderman, and Superman - the little white boys are now free to identify with the female, PoC and LGBT heroes who have replaced them. In fact, because gender, sexual preference and gender identity are all just social constructs - 


Wait a minute, once again... 


Little girls, PoCs, LGBTs - they need representation, right? But straight little white boys evidently don't need any representation, and it isn't a problem, because being a straight little white boy is just a social construct? Ethnicity is a social construct, remember, so is gender identity, and sexual preference is a *spectrum*. Everyone starts out gay, and society makes them pick a team - except for gay men, who are born that way and don't make a choice. So... why does representation matter for females, PoCs and LGBTs if it is all a construct, but *not* matter to straight white males *because* it is all a social construct? I'm having trouble following when it matters and when it doesn't and why... 

Let's be clear - my criticisms of this ideology and its value systems are no reflection on my feelings about media being more representative of minorities. I think there is nothing wrong with communities, cultures or lifestyles writing their own fiction that exclusively features, and is targeted toward, those specific communities. Written well enough, with human, relatable characters, those "whole-cloth" stories can have considerable crossover appeal. I fully support this. Steven Universe is not written for me; I'm not its demographic. But if someone can get the idea green-lighted, get it produced and published, and find a market for it and make it a success, then this is something that should exist. In fact, maybe Steven Universe is the blueprint for what "representation" should look like: stories written for a specific target market, even if they have little or nothing to offer to those not in that demographic. 



There is nothing for you here, Boomer, move along.

The problem is taking He-Man, or The Thundercats, James Bond, or The Lord of the Rings, and re-imagining and recasting it for the specific sake of "inclusivity." If we do it the *other* way around, it gets called "appropriation," and that is a BAD thing. Something we're not allowed to do. Something that is insensitive. But when we make Ariel a little black Mermaid - it is celebrated as progressive and a victory for inclusivity. Let's do it with everything... in fact... 

Let's just recast every character EVER as Idris Elba.
That'll solve the problem once and for all. 


The contradictions and hypocrisies and double standards in this ideology are so blatant - and yet we've become a society that is uncomfortable even suggesting this is the case - and those in control of the information frequently label it "hate speech," "misogyny" and "racism" if you point out the inconsistencies you see and try to have honest discussions. You're not *allowed* to ask questions about the plot holes in the story you're being told - and if you do, you're perpetuating oppression, hatred and discrimination. They'll shut down your account, they'll try to "cancel" you. If you disagree with them or even question things that don't make sense, you're probably secretly a Nazi - and what the world needs is a female, black Captain America who is super-powered to beat the living snot out of average citizens like you. 

And this is how we get eq-ui-ty!





http://donovancolbert.blogspot.com/2021/09/if-not-for-double-standards.html