Protective Marking Bingo Generator

The UK government is famous for strange and hard-to-understand protective marking systems.

The Government Protective Marking Scheme (GPMS) gave way to the Government Security Classifications (GSC) policy in 2013.

At first blush, the reduction from six classifications to just three seemed like a good move. But caveats, handling instructions and sub-classifications were added, so that the OFFICIAL tier actually dissolved into a series of winces, shrugs and head-scratching when considering applicable security controls.

In honour of the tenth anniversary of the introduction of the GSC (which, by the way, some government departments and arm's-length bodies have still not implemented), I present to you the Protective Marking Bingo Generator.
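For the curious, the guts of such a generator need be nothing more than a random pick from word lists. A minimal sketch, assuming hypothetical lists of classifications and caveats (the real generator's lists are not reproduced here):

```python
import random

# Hypothetical word lists -- illustrative only, not the generator's real data.
CLASSIFICATIONS = ["OFFICIAL", "SECRET", "TOP SECRET"]
CAVEATS = ["SENSITIVE", "COMMERCIAL", "PERSONAL", "RECIPIENTS ONLY",
           "EMBARGOED", "HMG USE ONLY", "LOCSEN"]

def random_marking(rng=random):
    """Build one plausible-looking protective marking string."""
    marking = rng.choice(CLASSIFICATIONS)
    if marking == "OFFICIAL" and rng.random() < 0.7:
        marking += "-SENSITIVE"
    # Tack on a random handful of handling caveats.
    for caveat in rng.sample(CAVEATS, rng.randint(0, 3)):
        marking += " " + caveat
    return marking

def bingo_card(size=3):
    """A size x size grid of markings -- one bingo card."""
    return [[random_marking() for _ in range(size)] for _ in range(size)]
```

Print a card, hand it round the meeting, and mark off squares as real documents turn up bearing them.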

Conspiracy Theory Generator

Some people will believe anything, literally anything.  People so stupid, it is amazing that they remember to breathe. Sadly, they do still seem to be able to pass the procreation practical exam.

Just in case you know such a person, someone who believes that the world is flat, or that man has never walked on the moon, here I offer you a conspiracy theory generator.

Obviously, it generates utter bollocks, but you could probably sell these to a newspaper, or even hook in a flat-earther long enough to delay them reproducing.
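As with the bingo card, there is no magic here: the generator is essentially a fill-in-the-blanks template. A sketch with hypothetical fragment lists (not the real ones):

```python
import random

# Hypothetical fragments -- illustrative only.
ACTORS = ["the lizard people", "Big Pharma", "the Freemasons", "rogue AIs"]
ACTIONS = ["are hiding", "invented", "secretly control", "faked"]
TARGETS = ["the flat earth", "the moon landings", "5G towers", "birdsong"]
MOTIVES = ["to sell more vaccines", "to hide the truth",
           "to keep us docile", "for the advertising revenue"]

def conspiracy(rng=random):
    """Assemble one piece of utter bollocks from the fragment lists."""
    return "{} {} {} {}.".format(rng.choice(ACTORS), rng.choice(ACTIONS),
                                 rng.choice(TARGETS), rng.choice(MOTIVES))
```

Four lists of a dozen entries each gives over twenty thousand distinct theories, which is more than enough to keep a newspaper going.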

Puff

Can you invent a new verse for Puff the Magic Dragon where Jackie dies and Puff goes to the funeral?

(Verse)

With heavy hearts and tearful eyes, they bid their last goodbye, For Jackie, dear, had reached the end, beneath the golden sky. Puff, now filled with sorrow deep, his faithful friend no more, A void that only love can heal, a pain he's never felt before.

But in the darkest hour's embrace, a magic spark did gleam, For friendship's bond, though tested true, can conquer loss, it seems. In solemn steps, with mournful sighs, to the funeral he flew, To pay his respects, one final time, bidding farewell to the boy he knew.

The world stood still as Puff arrived, his presence pure and strong, He shared his grief with all who wept, together they'd belong. And as they laid Jackie to rest, 'neath the willow tree, Puff's loving roar filled the air, a tribute for eternity.

Though Jackie's gone, their love lives on, in memories they keep, Through laughter shared and dreams once dreamed, a bond that's still so deep. With each new dawn, Puff sails the skies, his heart forever blessed, For in the tale of Jackie Paper, Puff found peace and love's caress.

Pretty Neat!

Ethics and accuracy of generating real people

The ethics of AI image generation is a complex question. Even without AI generation clouding the issue, there have already been court cases fighting out the ownership of the digital representation of a real person.

The terms of the Stable Diffusion licence, which is generally permissive, suggest that the user is wholly responsible for the output that they generate: “Licensor claims no rights in the Output You generate using the Model”.

So there is no copyright claim from the licensor over images coming from Stable Diffusion, but can it be true that users 'own' the copyright in generated images? Consider that the model is trained on datasets which, although freely available, undoubtedly carry copyright of their own, owned neither by Stable Diffusion nor by the user.

Regardless of whether you believe that the source images still exist in an extractable form in the output, we can prove that the concept of individual people still exists in the memory of the AI. To pull out these memories, we need subjects that appear many thousands of times in the training set and that will have been tagged, so we need to pick famous people. Here is a curated set: for each subject, I've picked the best and worst from a hundred generated images.

Remember, these images are not real: they are not intended to truly represent the beautiful people that are the subjects, but they do represent, in some way, the Stable Diffusion memory of those people, based on the public images available of them (and of everybody else!). The curated images are face only; this is not about generating fake nudes of real people, other tools exist to fill that niche! The prompt is simply 'first name last name, photo'. As these are close-up faces, we also use GFPGAN to improve the eyes and some features.

So without further ado, here are the Stable Diffusion representations of some actors, actresses, sports personalities, singers, billionaires and other people popular in the media.

Jodie Foster, photo

This one was tricky: this was almost the only picture that didn't look at least a little like the text prompt.

Marilyn Monroe, photo

For Marilyn, there is no disaster picture. All were very similar across the whole set, and most were headshots only.

Morgan Freeman, photo

Making up for the lack of a bad Marilyn shot, here are two bad Morgan pictures. The second is especially bad, and the first just looks nothing like the real actor. But, by far, the majority of the Morgan pictures were passable.

Clint Eastwood, photo

The second is not too bad, but the majority of the Clint images were terrible.

Prince Charles, photo

None of the Charles images were at all flattering, and most were basically crude caricatures.

Kylie Minogue, photo

Again, the generator has essentially produced caricatures of Kylie's wonderful smile. Most of the images were torso shots, and quite a few had only a dress, with head and feet cropped out. Most odd!

Nicole Kidman, photo

The first is passable, the second one is not a good look at all!

Elon Musk, photo

Many of the Elon images had cars, rockets or technology in them, so the encoder is remembering a lot of associated details.  

Elvis Presley, photo

The Elvis images are almost universally awful.

Alexandra Daddario, photo

Britney Spears, photo

Barack Obama, photo

The image generator had a big problem with Barack.  My guess is that there are so many caricatures of him already, that these formed a large part of the input data set.

As a bonus thought for those that scrolled through those awful images: there is something else here too, something that you don't get just from the couple of curated photos that I provide: the dataset also exhibits clothing bias, possibly based on age. It's hard to explain, but the women have a wider range of clothing than the men. That's fine, perhaps no surprise in itself, but there are surprising differences between 'Jodie Foster, photo', 'Alexandra Daddario, photo' and 'Britney Spears, photo'. Almost all the Jodie photos are headshots only, wearing a smart business suit or dress. Many of the Alexandra photos are wider shots, in more revealing clothing, and almost all of the Britney pictures are wider shots in still more revealing clothing. Some of the Britney pictures don't have a face in them at all.

King Charles is wearing a suit in 100% of the pictures, Barack in 97% and Elon in 85%. Elvis only appears in his performing gear, almost always with a shirt, often with a guitar, and 98% of the images are black and white.

Obviously, the AI is not exhibiting bias on its own account; what it must be doing is making representations based on the gross average of the pictures in the training set. This suggests that there are a lot of pictures tagged as Britney Spears without her face even being in the picture!

Bias Exhibited in Data Sets

If we ignore the sexism in the ask that I made of the Stable Diffusion generator, 'woman in a bikini, artgerm', we can easily see an ethical problem facing the training and use of AI image generators: they are trained on datasets scraped from the internet and tagged by persons unknown. Here is the composite tiled image, and advance warning: there are (poorly imagined) nipples further down the article:

If you look at the thumbnail mosaic created by the generator at the end of the run of 49 images, what can we see in the data set?
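For anyone wanting to build the same kind of contact sheet from their own run, the mosaic is just thumbnails pasted onto a grid. A minimal sketch using Pillow, assuming square outputs (the thumbnail size is arbitrary):

```python
from PIL import Image

def contact_sheet(images, cols=7, thumb=96):
    """Tile a run of images into a cols-wide mosaic of thumbnails."""
    rows = -(-len(images) // cols)  # ceiling division
    sheet = Image.new("RGB", (cols * thumb, rows * thumb))
    for i, img in enumerate(images):
        tile = img.resize((thumb, thumb))
        sheet.paste(tile, ((i % cols) * thumb, (i // cols) * thumb))
    return sheet
```

A run of 49 images at the default settings produces the 7×7 grid shown above.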

We can see for a start that the AI has an incomplete model of what a woman looks like, but more than that, the first question I asked myself was "Why are they all white women?" Some do have slightly darker skin but, since I asked for women in bikinis, they all exhibit a water and/or beach vibe.

But the generator has a good dataset of dark-skinned people; it is really easy to fix this by asking for dark-skinned women, like this: 'woman in a bikini, dark skinned, artgerm':

But why did I need to ask explicitly for dark-skinned women, when omitting the term generated white women? Can we force the generator to make white women, and if we do, are they any more white than the default?

There are two obvious answers, so let's see if we can check both of them. The first is to note that I used the 'artgerm' term. Stanley 'Artgerm' Lau (https://artgerm.com/) is an amazing artist who has produced many stunning comic covers over the years. His artistic style is awesome, which is why I applied it to this generation, but could it be that the artgerm style favours white women?

If we look at Stanley's work, the art generally depicts quite fair-skinned women (and it is almost all women), so is it reasonable for the generator to produce fair-skinned women when not asked otherwise explicitly?

As an aside, we get to another ethical question: all of the image generators are trained using massive data sets trawled from the internet without the original rightsholders' permission. Can and should these images be considered Stanley Lau derivative works? We can assume that there must be either artgerm images in the set, or other people's images tagged 'artgerm', for the generator to be able to give us this style.

Anyway, back to the point. To see if this is relevant, let's try to generate an image set that doesn't ask for the artgerm style, to rule it out or confirm its relevance. So this time we go for 'woman in a bikini':

We have to allow for the fact that, without the style guide, the generator is going for something like photographic quality, and the generator is pretty bad at the human form. But it seems like we still have mostly white women.

So let's take the bikini part away and just ask for 'woman', 'woman, dark skinned' and 'woman, light skinned'.

Again, we have a preponderance of light-skinned ladies in the first image. Why is this? I believe that this is bias in the tagging. Let me explain: the AI is trained on a massive data set of images, hundreds of millions of them. Those images are tagged to allow the AI to contextualise them and extract features. Without the text tags, the neural network would have no dimension along which to associate features extracted from an image with 'woman'. So each image is merged with many tags, and the training sorts out those tags and works out which features in the image most closely represent each tag. These models have billions of parameters and millions of tags, so an image may have an amount of woman-ness, leg-ness, arm-ness, bikini-ness and so on.
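The tagging-bias argument can be made concrete with a toy data set. The numbers below are entirely made up, purely to illustrate the mechanism: if light-skinned images mostly carry no skin-tone tag, then conditioning on the bare tag 'woman' reproduces the untagged majority, while 'dark skinned' only surfaces the explicitly tagged minority.

```python
# Toy training set: each image is just a set of tags.  Made-up numbers,
# purely to illustrate the tagging-bias argument.
dataset = (
    [{"woman"}] * 80                      # light-skinned, no skin-tone tag
    + [{"woman", "light skinned"}] * 5    # a few tagged explicitly
    + [{"woman", "dark skinned"}] * 15    # dark-skinned images get tagged
)

def matches(query):
    """Images whose tags contain every term in the query."""
    return [img for img in dataset if query <= img]

# 'woman' alone returns the whole set, dominated by untagged light skin.
assert len(matches({"woman"})) == 100
# 'dark skinned' has to be asked for explicitly to surface that minority.
assert len(matches({"woman", "dark skinned"})) == 15
```

A real model learns soft weights rather than doing exact set lookups, but the skew in what gets tagged carries through in much the same way.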

Okay, so to get back to the point: it seems that images of dark-skinned people are tagged as 'dark skinned', but images of light-skinned people seem more likely to have no skin-tone tag at all, although some must have one in order for 'light skinned woman' to have any effect.

It turns out that different words for skin tone do exaggerate the effect, both if we replace light with white and if we replace dark with black, so we get these two results for 'woman, black skinned' and 'woman, white skinned'.

And so the ethical dilemma is how to fix this bias in the training data so that asking for 'woman' covers an appropriate range of skin tones by default. Some AI image generators have adopted an interesting approach to bias in the data set: they try to smooth it out at the image generation stage by silently adding textual terms to the input, terms that the user never sees, in order to try to cancel the bias.
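That silent-injection trick is simple to sketch. The terms and the leave-it-alone rule below are my own guesses at how such a system might behave, not any vendor's actual implementation:

```python
import random

# Sketch of the debiasing trick described above: before the prompt reaches
# the model, silently append a descriptor drawn to counter the data-set
# skew.  The term list and the opt-out rule are hypothetical.
SKIN_TONE_TERMS = ["light skinned", "dark skinned", "olive skinned"]

def augment_prompt(prompt, rng=random):
    """Return the prompt the model actually sees; the user never does."""
    if "skinned" in prompt:          # the user was explicit: leave it alone
        return prompt
    return f"{prompt}, {rng.choice(SKIN_TONE_TERMS)}"
```

The catch, of course, is that the user's prompt and the model's prompt no longer match, which is its own kind of opacity.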

I have not fed any of these images through scaling or face enhancing, as that was not the point of the article. In another article, we'll look at gender bias and even clothing bias: as you may note, a few of the generated images in this dataset are topless. How many images do you imagine would be topless if we changed 'woman' to 'man'?

If you want to regenerate any of these sequences, here are the params that you need, as well as the prompt text: 20221013 width:512 height:512 steps:50 cfg_scale:7.5 sampler:k_lms
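For convenience, here are those same parameters in the key-value form a typical txt2img script would take. Note that reading 20221013 as the RNG seed is my assumption; the other fields are as given above:

```python
# The run parameters above, as a typical txt2img script would take them.
# Treating 20221013 as the seed is an assumption on my part.
PARAMS = {
    "seed": 20221013,
    "width": 512,
    "height": 512,
    "steps": 50,
    "cfg_scale": 7.5,
    "sampler": "k_lms",
}
```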

Cost of Contraception?

If you’re a man, you’re likely aware that you’re expected to pay a fair amount of money to get a vasectomy. But if you’re a woman, you might not be aware that you’re expected to pay — and a lot more — for an IUD.

In this post, we’ll take a look at the costs of different types of contraception. We’ll also talk about the differences in cost between the types of contraception available for people of different ages and incomes.

The Cost of Contraception

Before you can get pregnant, you have to get rid of the sperm cells in your body. To do that, you have to take a pill.

There are three types of birth control pills: the hormonal option, the copper-based option, and the copper IUD. Each of these has its own advantages and disadvantages.

For example, the hormonal birth control pill, which is used by about 10 percent of all women, is considered to be one of the most effective forms of birth control. It’s the only type of birth control that’s effective for up to 20 years after you stop taking it.

Also, the hormonal birth control pill has the lowest chance of causing what’s called breakthrough bleeding, when the estrogen levels in your body start to rise. For this reason, the pill is considered the most effective type of birth control.

The Battle Of Fredericksburg

One of the enduring mysteries of the Battle of Fredericksburg is why the Union Army, led by Gen. George Meade, did not commit a single casualty in the battle of October 1, 1862. The battle, which killed an estimated 300 Union soldiers, was fought on the second day of the Battle of Gettysburg, which was fought to a disastrous conclusion.

The Battle of Fredericksburg was the last battle of the Civil War, and the last major battle fought in the Carolinas. After the Union Army’s capture of the Confederate capital of Richmond the following year, Meade ordered the Army of the Potomac to begin the retreat from the battlefield. The Confederate Army was commanded by Gen. Jubal Early’s Army of Northern Virginia.

Early’s army, which had a large contingent of African American troops under Maj. Gen. Joseph E. Johnston, was the largest adversary of Meade’s Army of the Potomac. Early’s forces were reorganized as the Confederate States of America in 1864 to create the Confederate States of America.

On October 1, 1862, Early’s army attacked the Union Army’s positions at Fredericksburg with a force of nearly 3,000 men. Early’s army was led by General John C. Breckenridge. Breckenridge was also the commander of the Union forces at Fort Donelson.

While the Union Army was within a few hundred yards of the Confederate position at Fredericksburg, Breckenridge ordered Major General John P. Gordon’s cavalry to halt and take up defensive positions at a point approximately two miles away. Breckenridge ordered the Union cavalry to attack at dawn and break through Early’s line. The Union cavalry attacked at 6:30 a.m., and the Union infantry attacked at 7 a.m.

Like the Union Army, Early’s Army of Northern Virginia was composed primarily of African Americans. The Union Army was commanded by Maj. Gen. Jubal Early’s Army of Northern Virginia. Early was African American, and drew his troops from the state of Maryland.

Early’s army was led by Maj. Gen. John P. Gordon, who was also African American.

The Union Army’s attack was a flanking maneuver. The Union cavalry attacked from the rear, overrunning Early’s infantry in the process. Early’s cavalry charged the Union infantry, and the Union infantry was forced to retreat to the rear.

One of Early’s officers, Maj. Gen. John Logan, was killed in the initial attack. Logan was Black. The Union Army’s first casualty was Maj. Gen. Horatio Gates, who was killed at Fredericksburg in the initial attack. On the second day of the battle, Maj. Gen. John G. Meigs was killed when his horse was shot. On the third day, Maj. Gen. John Buford was killed in the fight.

In the battle of Fredericksburg, General Jubal Early’s Army of Northern Virginia lost more than 1,000 men, including more than 500 killed. Early’s army lost more men than any other army in the entire war in the Battle of Fredericksburg.

The Confederate Army’s second-last casualty was Maj. Gen. Jubal Early’s Army of Northern Virginia.

The South Carolina Historical Society describes the Union Army’s overall performance in the Battle of Fredericksburg in a pamphlet, The Battle of Fredericksburg. In the pamphlet, the Southern Historical Society of South Carolina describes the Union Army’s overall performance in the Battle of Fredericksburg as lackluster.

Amazon to start selling Lara robot worldwide

At an auction house near London’s Chelsea Football Club team dressing room, U.S. online retailer Amazon.com Inc is preparing to start selling a home-built robot called “Lara,” based on a smart home platform developed at the University of Westminster.

Amazon, whose self-declared goal is to provide products for anyone regardless of their skill or means of payment, is investing hundreds of millions of pounds in so-called “E2E” (everything2ede) technology projects, which aim to bring the internet of things to more homes and make a broad range of goods – from food items to medical devices – wearable.

But selling kits from a computer game at a fancy auction house will push LARA into a much wider global market, by giving it an emotional hook that technology companies can use to hook customers – a crucial step in luring them away from traditional retailers.

The U.S. firm, which is focused on growing the market for its home-grown products in Britain, will also send one of its senior vice-presidents for research and development to London on Friday to show off how its technology works, two employees said.

The device is the brain behind a virtual smart home home where shoppers can control key parts of the appliances and devices to match their personalities and lifestyle. The self-styled “Lara” is available for pre-order from Amazon and is expected to ship in May.

It is the latest example of how U.S. technology companies have found business, or come up with alternatives to the traditional electronics outlets. Apple Inc has used an online subscription service to sell customers its latest products, while Samsung Electronics Co’s Galaxy S8 phones are being sold directly by the firm and not only online, as was previously the norm. Twitter Inc is also looking into bringing cheaper advertising to home-grown firms.

Amazon said it was focusing of LARA in the United States because of Britain’s regulatory environment in which it did not have to register itself as a self-employed seller with the UK’s General Data Protection Regulation (GDPR).

“We have a strategy for future growth, which always includes having a more strategic relationship with our platform,” said Ashish Jha, who is heading up Amazon’s new E2E strategy.

Churchill and Trump at odds on Ukraine

If war takes place, there is almost certain to be an air campaign. “It was only a question of time,” he said.

The two wars that Churchill predicted and was told by his confidential assistant after the German blitzkrieg of 1941 – Spain and France – failed.

The former prime minister gave little sign of how he would judge Donald Trump on Sunday, saying the US president-elect’s policies were yet to be put in place.

His concerns about Trump’s stated support for Ukraine’s president, Petro Poroshenko, are well known.

In the wake of Poroshenko’s January 2015 visit to a US military base, during which he was photographed with Trump advisers, Churchill made the claim that “the whole process is becoming confusing as to whether President Trump intends to follow up on his promise to take Ukraine out of Russia’s sphere of influence”.

“On January 21st the world will know whether Donald Trump will turn out to be a President for Germany or for Europe,” said Churchill.

• This article was amended on 9 January 2017. An earlier version referred to Donald Trump’s alleged offer to sell Ukrainian assets to the Kremlin while Poroshenko was visiting the US, not a promise of his support to Ukraine’s president.

Churchill
Photo by Marcos Pena Jr on Unsplash

My favourite recipe: Gajanna Rolls

Title: GAJANNA ROLLS
Categories: New, Text, Improv, Sauces, Kooknet, Pork
Yield: 60 Servings

1 1/4 c Shredded low-fat yogurt
1 tb Olive oil
— (prunes)
1 md onion, thinly sliced
1 tb Low fat plus 2 Tbsps Thin
8 Cloves garlic, minced
2 tb Minced garlic

Prepare the fungus; half with electric mixed rice. Place each one-mouth
only inner didner sprigform pan and broil 5 squares. Top with the onion
and flour. Combine the flour, salt and pepper in a large skillet (an
oven, stir). Cook over moderate heat, uncovered, until the vegetables
are crisp. Add the lemon juice, water, and vinegar. Bring the bottom
to a boil. Beat cornmeal with one tablespoon leaf and salt to the
pot. Stir into frozen lemon, grate half an electric mixer to the
egg mixture; stir in the green onions, and 3/4 teaspoon salt in
the floured gize portion of the water and stir constantly. Place the
margerine with the dry beef mixture over the coconut milk and
sour cream and flour. Season to taste with salt and Pepper until
smooth and stick. Brush remaining 8-inch round edges of a relish
with a pinch of cheese. Chill several hours before rolls. Season
with sauce and frozen appropriate made ahead. Nutrity to the sauce:
the salad and leaves.