#hci – @aeolianblues on Tumblr

aeolianblues

@aeolianblues / aeolianblues.tumblr.com

Amateur writer and cartoonist, trash poetry specialist, musician, punk radio host, computer science student and enthusiast. Muser, hi hello! Museblogging at @sunburnacoustic. Disastrously cooking at @vengefulcooking
elkian

One small but extremely annoying effect of Tech Modernization or w/e is how garbage UI contrast is these days, especially in, like, application windows in general.

"Ooh our scrollbar expands when you mouse over it! Or does it? Only you can know by sitting there like an idiot for 3 seconds waiting for it to expand, only to move your cursor away just as it does so!" or Discord's even more excellent "scrollbar is 2 shades off of the background color and is one (1) pixel wide" fuck OFF

I tried to move a system window around yesterday and had to click three times before I hit the half of the upper bar that actually lets you drag it. Why are there two separate bars on that thing with absolutely nothing to visually differentiate them?

“Well if you look closely-” I should not!! have to squint!!! at the screen for a minute straight to detect basic UI elements!! Not to mention how ableist this shit is, and for what? ~✨Aesthetic✨~?

And then every website and app imitates this, but in different ways, so everything is consistently dogshit to use, but not always in ways you can immediately grok. It's!!!! terrible!!!! Just put lines on things again, I'm begging you!!!!

aeolianblues

Coding isn’t all there is to computer science/software development

So our prof decided to give us feedback on every group’s work for a project we just finished, and roughly speaking, there were three categories: 1. most inventive, particularly loved this; 2. well thought out, good implementation; 3. OK, good work.

Our project was in a fourth group: “At least you tried”

((((((:

This is for an HCI (human-computer interaction) course. The only reason I’m doing this degree at all is that I want to go into HCI, and in the ONE course I get on that subject in my entire degree, the best we could do is “at least you tried”. I want to punch a wall.

The thing is, I knew it was going to be this way? She’d specifically said, try not to do an app, because it’s all been done (and I agree). But the thing is, to use something like an e-textile or a microcontroller circuit board, anything you can’t just download and upload to a shared folder, you need to be in physical proximity to each other. During a pandemic? Nuh-uh! We were not only some thousand kilometres apart, we also had massive time differences! We weren’t working together; we were a unit dividing up work to do individually.

Second, a very specific problem, and I’m about to get bitter here, so turn away if you’d like, but it’s something I see so often in computer science it drives me wild. People act like coding and software development are all there is. They prioritise them so much that anything that isn’t building a Java applet is somehow lesser and pointless. Someone told me they were disappointed that this semester had fewer coding-based projects. Others in our class have been moaning, going “why is an HCI course compulsory for a software major?? ugh, useless course”. Well, maybe, guys, because changing interfaces have ALWAYS been a part of computer science? Just because apps have been around for the ten-ish years since Apple’s iPhone sparked off the mobile development trend doesn’t mean the interface isn’t going to keep evolving. Just because it’s all you’ve seen growing up doesn’t mean it’s the only thing out there. It doesn’t mean the app format is the be-all and end-all of the computing world.

It’s been done. To death! Also, even for a mobile app (code word for software-major porn, methinks, come on now), if your app is badly laid out and confusing to use, no one will use it. Heck, even if it’s useful, something as small and stupid as “I wish I could have a dark background” is enough to make someone not want to use your app; people are stuck up and stupid like that. So maybe shove your disdain for how she literally HAD to teach you psychology and design principles up your arse, love? If we’d been so stuck up about what constitutes computer science and what doesn’t, you wouldn’t have a mouse today. Enjoy your ASCII-everything, because psychology had a big hand in people understanding how to use a mouse in the first place; coming from a purely software outlook, you’d have thought it couldn’t be done and no one would understand it. Heck, we’d still be programming punch cards; now that’s pure computer science, isn’t it.

My point is, I could swear some people only refrained from dropping out of this course so that they could write some code and show off something usable to put on their resume with a GitHub link, and I know I’m not being very nice here, but it’s how I saw it. Yes, our group did an app as well. I genuinely wish we’d done something more deeply HCI, and uurghh, it makes me sad, because that’s all the HCI experience I’m going to get in my entire university career. And it’s done. Over. Blown it.

I just wish people would stop slagging it off; it’s the most interesting field of computer science. It’s the most relevant, in a sense, because you’re going to run into it anywhere you ever think of a computer. Whether you’ve formally studied it or not, you have used some principle of human-computer interaction. If you’ve ever used and understood a feature of anything remotely resembling a computer, you’ve employed HCI yourself. Used an ATM? HCI! Done up an Excel sheet? HCI! (And a braver person than I… I once wrote a Python program to draw up graphs from data I had because I was too intimidated by Excel.) Are you reading this text post on a glass/plastic screen, having understood that the characters on the screen are meant to be read in plain language, the same way you’d read them off a pamphlet or a book? H.C.I.! See? It’s everywhere!
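(For the curious: that kind of graphing script is nothing fancy. A minimal sketch, not my original code, assuming a plain two-column CSV of numbers with no header row and matplotlib installed:)

```python
# Quick-and-dirty "graphs without Excel": read x,y pairs from a CSV
# and plot them. Assumes data.csv has two numeric columns, no header.
import csv
import matplotlib.pyplot as plt

xs, ys = [], []
with open("data.csv") as f:
    for row in csv.reader(f):
        xs.append(float(row[0]))
        ys.append(float(row[1]))

plt.plot(xs, ys)
plt.xlabel("x")
plt.ylabel("y")
plt.title("My data, no spreadsheet required")
plt.savefig("graph.png")  # or plt.show() for an interactive window
```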

Anyway, that’s it. It’s just disappointment at a wasted semester and a wasted degree. I wish I knew more about HCI. I really wish I did.

@lastbenchpapers here you go!

A lot of this stuff is still in one of two wildly opposite spheres: in research, or in DIY spaces (which, if you think about it, are basically the same thing; one writes blog articles, the other writes research papers and gets some tiny funding, but otherwise they’re very similar!)

Basically, the idea is that you have conductive threads which, instead of running inside a solid plastic computer case, are used to sew fabric while also completing circuits. You attach a microcontroller to these circuits (usually an Arduino, though nowadays you can find smaller, more elegant boards specialised for fabric work, like the Arduino LilyPad or Adafruit’s wearable boards), and so the beaded ends of the fabric really hold sensors that can sense anything from tilt, temperature and humidity to heart rate, you name it! Sure, AR/VR/MR is the loudly trumpeted future of computer interaction, but just look at how convenient wearable technology would be and try to tell me it’s not destined to be the inevitable future of computer technology! It’s everything present-day computers strive to be: accessible, unobtrusive, compatible with your lifestyle, but it does it in a completely different, maybe even organic way.
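(To give you a taste of the software side: the sewn circuit just shows up to the board as an ordinary input pin. A minimal sketch in CircuitPython, which Adafruit’s wearable boards run; the pin name is a made-up placeholder, and a beaded tilt sensor is electrically just a switch:)

```python
# Read a sewn-in tilt switch: a conductive bead bridges two thread
# contacts when the fabric tips, closing the circuit to ground.
import time
import board
import digitalio

tilt = digitalio.DigitalInOut(board.D2)  # hypothetical pin the threads are sewn to
tilt.direction = digitalio.Direction.INPUT
tilt.pull = digitalio.Pull.UP            # reads True until the bead closes the circuit

while True:
    if not tilt.value:                   # pulled low: the fabric has tilted
        print("tilted!")
    time.sleep(0.1)
```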

(They’d easily pass for traditional embroidery too)

Anyhow, I’ll shut up now; I could go on about HCI for hours. I’ll leave you with some of these links. Enjoy!

Beaded tilt sensor (this site’s got a bunch of other computers you could make on the left sidebar too):

Why would this be useful? Here’s one possibility suggested by researchers: mice that can be controlled by head tilts, for people with disabilities!
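(Hand-waving the hardware, that idea is surprisingly little code. A hedged sketch, assuming a CircuitPython board with native USB and the adafruit_hid library, with made-up pin assignments for two tilt switches sewn into a headband:)

```python
# Two sewn tilt switches act as "look left" / "look right"; the board
# presents itself to the computer as an ordinary USB mouse.
import time
import board
import digitalio
import usb_hid
from adafruit_hid.mouse import Mouse

mouse = Mouse(usb_hid.devices)

def make_switch(pin):
    sw = digitalio.DigitalInOut(pin)
    sw.direction = digitalio.Direction.INPUT
    sw.pull = digitalio.Pull.UP
    return sw

left = make_switch(board.D2)    # hypothetical pin assignments
right = make_switch(board.D3)

while True:
    if not left.value:          # head tilted left
        mouse.move(x=-5)
    if not right.value:         # head tilted right
        mouse.move(x=5)
    time.sleep(0.02)
```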

As used in pillows: decorative, and it could potentially make for smart pillows with tracking. Say a pressure sensor feels your head slump in as you fall asleep and triggers a switch to draw the curtains for you (retractable fabrics are another cool thing I got to see during my course!)
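(Same pattern again; a sketch of the sleepy-pillow idea, assuming a fabric pressure sensor wired as a voltage divider into an analogue pin and some curtain mechanism hanging off a relay. Every pin name and threshold here is a made-up placeholder:)

```python
# Fall-asleep detector: when pressure on the pillow sensor crosses a
# threshold, fire the (hypothetical) curtain relay once.
import time
import board
import analogio
import digitalio

pressure = analogio.AnalogIn(board.A0)       # fabric sensor as a voltage divider
curtains = digitalio.DigitalInOut(board.D5)  # relay driving the curtain motor
curtains.direction = digitalio.Direction.OUTPUT

THRESHOLD = 40000   # AnalogIn.value spans 0-65535; tune by experiment
drawn = False

while True:
    if not drawn and pressure.value > THRESHOLD:
        curtains.value = True                # head slumped in: draw the curtains
        drawn = True
    time.sleep(0.5)
```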

There’s other cool stuff you can do with conductive threads (you may have seen that viral video where they light up bulbs by touching a needle, a conductor, to the conductive threads used to embroider the thing). You could basically sew up a sound system if you wanted; the possibilities are endless! (Found one for you here)

I’ll reblog this when I have a few more resources, if I can still access my course website. I remember Instructables is also a great website to search around on. But yeah, HCI can definitely be the sort of cool hack a DIY website would teach you, and at the same time it’s such an important area of development, because kids probably won’t recognise technology as we use it in 20 years’ time.

I know I've talked to a few people about this recently, so I'm just going to reblog this post again for you all. HCI is certainly very cool!

flakmaniak

So, Microsoft is terrible. Yes yes, the oldest claim in the world.

But specifically… I just hate how Windows 10 tries to conflate and confuse web searches with things on one’s own computer. The start menu should never do anything related to web searching, especially when it purports to be showing me things that are on my hard drive!

This will make old, computer-illiterate people more malware-vulnerable. You have to maintain a strong distinction between “things that are on this computer (and maybe even included in Windows)” (safe, one hopes, or you already got pwned by it, probably), and “things on the web” (scary, dangerous, not to be trusted at all).

Eroding that barrier in the UI is awful. It just FEELS like a violation every time I start typing into the start bar, and it tries to show me ANYTHING web-related. My computer is NOT just an internet-portal! It has tons of stuff on it, and when I’m interacting with the OS, I ONLY want to see things that are already on here!

If I wanted to see something online, I would go to my browser! All the online stuff should be segregated into the browser!

Specific programs can access the internet; that’s fine. But my OS’s functions and interface should JUST be about the things that are already on my computer.

I literally spent multiple hours lobotomizing my Windows reinstall when I upgraded recently. The amount of awful shit they bundle in nowadays makes me long for the age of Win98, when software was merely bad rather than actively harmful.

leylin3

The exact setting in ShutUp10 for this issue is “Disable extension of Windows 10 search with Bing”; it’s at the very bottom, under Misc.
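(For the curious: as far as I know, toggles like this flip a per-user registry value; treat the exact value names below as an assumption rather than gospel. In Python’s standard library it would look something like this; run at your own risk:)

```python
# Disable Bing-backed suggestions in Windows 10 start-menu search by
# setting the (assumed) per-user registry values that control it.
import winreg

key = winreg.CreateKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\Windows\CurrentVersion\Search",
)
winreg.SetValueEx(key, "BingSearchEnabled", 0, winreg.REG_DWORD, 0)
winreg.SetValueEx(key, "CortanaConsent", 0, winreg.REG_DWORD, 0)
winreg.CloseKey(key)
# Sign out and back in (or restart Explorer) for the change to take effect.
```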

With hopes that @inthroughthesunroof and @mx-delta-juliette might find this useful especially.

Oh thank GOD, that feature has been driving me batshit for ages
