So said the guy upon taking a break from his task of designing a “New and Improved” whale harpoon.
Gilles, it seems that your technical abilities and enthusiasm for a great software product (which I absolutely appreciate!) blind you to the perils of Facial Recognition. Surely you’re not okay with rampant and pervasive surveillance, are you? Obviously (or so I assume), digiKam isn’t being designed for this purpose. However, by continuing to develop and advocate Facial Recognition, you are contributing to the “normalisation” of something which only a few years ago was justifiably considered to be vile and evil.
J Albrecht: I use digiKam for personal use. I have a bad memory, so facial recognition is an asset for me when I'm looking for that photo from years ago that contains my long-lost Aunt Milly. If it weren't for face recognition, it would take many hours of searching, or I would just give up.

Having said that, our own eyes use face recognition. Are you suggesting we gouge our eyes out so we can't do our own facial recognition? But wait: we might recognise what people sound like, so we might as well cut off our ears. But we also might know what they smell like, so maybe cut off our noses too. I understand that blind people can recognise a person by touch, so we should probably cut off their hands.

Technology, or the ability to do something, isn't evil. Using that technology in an evil way is. Please keep your paranoia out of this public forum. The developers work long and hard hours building something they love. If you can't be constructive, then please keep quiet.

On Sat, 2019-02-02 at 12:46 -0500, J Albrecht wrote:
> So said the guy upon taking a break from his task of designing a “New and Improved” whale harpoon.
In reply to this post by J Albrecht
On Sat, 2 Feb 2019 12:46:32 -0500
J Albrecht <[hidden email]> wrote:
> So said the guy upon taking a break from his task of designing a “New and Improved” whale harpoon.
>
> Gilles, it seems that your technical abilities and enthusiasm for a great software product (which I absolutely appreciate!) blind you to the perils of Facial Recognition. Surely you’re not okay with rampant and pervasive surveillance, are you? Obviously (or so I assume), digiKam isn’t being designed for this purpose. However, by continuing to develop and advocate Facial Recognition, you are contributing to the “normalisation” of something which only a few years ago was justifiably considered to be vile and evil.

In that case, what about electricity? Think of all the bad things about the manufacturing of weaponry, or all the bad things caused by computers... Should we stop the generation of electricity?

> > On 2 Feb 2019, at 11:48, Gilles Caulier <[hidden email]> wrote:
> >
> > On Sat, 2 Feb 2019 at 15:52, J Albrecht <[hidden email]> wrote:
> > > Have you no shame?
> > >
> > > I cringe every time that I read about Facial Recognition software. You developers are like a bunch of giddy Manhattan Project nerds, gleefully and obliviously participating in the acceleration of our collective doom. You people are contributing to the development of scary, scary stuff. You’re not alone; consider yourselves members of a club which includes Cluster Bomb Designers and Genome Modifiers. Like you, they see absolutely nothing wrong with what they’re doing because, after all, they’re only building and promoting innocuous technology which will benefit “us”. Yeah, right. Tell that to the Uyghurs… …and to your children.
> >
> > You talking to me???
> >
> > If yes, thanks for the noise and for helping the project in a constructive way... It's always a pleasure to read this kind of words in an open source community.
> >
> > Gilles Caulier

--
sknahT vyS
In reply to this post by Rob D
“But it makes our lives so much more convenient”. Yup. So does clear-cutting for cattle grazing so that we can eat more burgers.
digiKam is a fantastic software package for which I am thankful and which I use extensively. I truly appreciate all of the hard work and dedication of Gilles and his team. Yet this “focus” on Facial Recognition is short-sighted and will undoubtedly be proven in the future to have been misguided. The backlash to Facebook’s blatant attacks on personal privacy is a prime example of the public’s growing outrage as the incipient erosion of liberty becomes ever clearer.

"Technology or the ability to do something isn't evil. Using that technology in an evil way is."

Sure, just like they always say in the mad US: “Guns don’t kill people, people kill people”…

A comment was made in reference to the Open Source community. Perhaps this community should take a look at the commercial side and see how some of those members actually take a moral stand: https://hypebeast.com/2018/12/google-employees-protest-china-search-engine

Or, is this silly because these people are merely “paranoid”?
On Sat, 2 Feb 2019 at 22:06, J Albrecht <[hidden email]> wrote:
True, but in fact no... You forget one point: the data for face recognition stays private, located in "your" database on "your" computer. You control all of it. Sharing of database contents (for example, when you export to a cloud service) is very limited. With 6.1.0, everything passes through a unified, dedicated interface shared by Showfoto and digiKam, which can be used to share only a few basic pieces of information. None of the face-information management methods exist there, and there is no plan to add this feature.

Another point is the file metadata digiKam can write if the user enables the right option. Face tags can be recorded there, and this point is true: if you share such a file in the cloud, the web service will be able to scan the face tags in the background. This is why the plugin interface is important, and especially the BQM (Batch Queue Manager) workflow to process files in a queue before export. In BQM you already have a magic tool named RemoveMetadata. If you process files with this tool before exporting, you break any web-service scan that tries to detect face information in the files. The goal of including export tools in BQM is to give the end user every possibility to increase "your" privacy with respect to the Internet world. I will do it in a unified and automatic way.

I'm aware of the "devil" uses of face detection and recognition in some countries, such as China, but there is no similar problem with digiKam. Don't forget that a lot of people want to see a working faces-management feature in digiKam. We have a lot of users coming from Google Picasa who want to see face detection/recognition working as expected. You are not alone, and developers must try to do the best for both worlds: 1/ provide advanced features; 2/ preserve users' privacy everywhere.

Voilà the plan for the digiKam project's policy on user privacy.

Best

Gilles Caulier
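[Editor's note: digiKam's actual RemoveMetadata tool is implemented in C++ on top of Exiv2, so the following is not digiKam code. It is a minimal, illustrative Python sketch of what "breaking the scan" means at the byte level: EXIF, XMP (including face regions and tags) and IPTC all travel in a JPEG's APPn/COM marker segments, so dropping those segments removes any embedded face information before upload.]

```python
def strip_metadata(jpeg: bytes) -> bytes:
    """Return a copy of a JPEG byte stream without APPn/COM segments.

    EXIF, XMP (face regions/tags) and IPTC are stored in APP1/APP13
    segments, so dropping all APPn and COM segments removes embedded
    face information. Illustrative sketch only; real tools (Exiv2,
    exiftool) handle many more edge cases.
    """
    if jpeg[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG (missing SOI marker)")
    out = bytearray(b"\xff\xd8")  # keep the Start-Of-Image marker
    i = 2
    while i < len(jpeg) - 1:
        if jpeg[i] != 0xFF:
            raise ValueError("corrupt marker at offset %d" % i)
        marker = jpeg[i + 1]
        if marker == 0xDA:  # SOS: entropy-coded image data follows; copy the rest
            out += jpeg[i:]
            break
        # Segment length field counts itself (2 bytes) plus the payload.
        length = int.from_bytes(jpeg[i + 2:i + 4], "big")
        segment = jpeg[i:i + 2 + length]
        # APP0..APP15 (0xE0-0xEF) carry metadata; COM (0xFE) is a comment.
        if not (0xE0 <= marker <= 0xEF or marker == 0xFE):
            out += segment
        i += 2 + length
    return bytes(out)
```

A file processed this way still displays normally (the quantization tables, Huffman tables and scan data are untouched), but a web service scanning uploads for face tags finds nothing to read.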
Thank you for your reply to this sensitive topic, Gilles. Your thoughtfulness is appreciated.
I can see where you’re coming from: you wish to develop something for which there is a clear demand. Yet there are other dangerous things for which there is also a demand. Take, for example, 3D-printed guns. Or, I suppose, recipes for cooking meth. Many people have the ability to develop either but have made the moral choice not to.

Indeed, the digiKam facial recognition data resides on private computers. However, and notwithstanding the “normalisation” of this nefarious technology that further development promotes, you and your team may unintentionally be providing assistance to developers of other software with more sinister motives.

As I alluded to earlier: “Are you okay with telling your kids that you willingly chose to participate in the development of the technology which is used not only to oppress others in the world but possibly to oppress them in the future as well?”
On 05/02/2019 at 18:32, J Albrecht wrote:
> As I alluded to earlier: “Are you okay with telling your kids that you willingly chose to participate in the development of the technology which is used not only to oppress others in the world but possibly to oppress them in the future as well?”

Blinking your eyes won't stop technology's "progress". Better learn how it works - or fail - and live with it :-(

jdd
--
http://dodin.org
In reply to this post by J Albrecht
On Tuesday 5 February 2019 18:32:03 CET, J Albrecht wrote:
(...)
> As I alluded to earlier: “Are you okay with telling your kids that you willingly chose to participate in the development of the technology which is used not only to oppress others in the world but possibly to oppress them in the future as well?”

That cat is well and truly out of the bag now; face recognition is already used in certain airports to speed up passport control. So developing it for digiKam is not going to suddenly provide orders of magnitude faster or better face recognition (all the more since the digiKam developers don't invent the algorithms used, AFAIK...).

While the concerns are valid, attacking a project like digiKam as if they are (out to help) the next batch of totalitarians is a bit over the top.

You might consider that the *users* have the responsibility of learning their tools and using them correctly, and that they can't hold others responsible for their own unwillingness to learn.

Remco.

(P.S. heviiguy is killfiled)
“killfiled”? How ironically appropriate.
Hi,

as the OP, I don't want to see this thread going totally off topic, although the risk may rise now due to my statement. In a nutshell, your intentions are all right, but your reasoning is more than doubtful. You compared apples and oranges quite often; I don't think that digiKam is a tool to threaten or kill people, unlike the Manhattan Project.

Face recognition is a common technology. There is the UK with CCTV all over the country; Germany has begun some experiments and is going to expand them. All this is nothing compared to China. Moreover, there are the secret services that read, classify and index any kind of data about all the people out there. We have to accept that face recognition is, and will be even more, a part of our life, in particular as social media becomes more and more an essential part of many folks' lives. This is the point where your criticism should focus.

There is software that does face tagging for free but at the cost of privacy (Facebook, Google), and there is commercial software, but there is also software like digiKam, which is FOSS. Instead of scaring people off with a complex, complicated workflow, it should ensure that more and more people use it, by simplifying it. That could prevent people from uploading their pics to the open web for the pure purpose of face recognition, which makes them readable by any company or state-run service.

Instead of complaining about face recognition technology, you should start a discussion about what happens to face-tagged pictures as soon as they leave the privacy of one's own picture collection. There should be at least a warning mechanism that tells you that you are about to export private data of others to the open web.

So please, if you want to contribute to this discussion in a sustainable way, open a thread about privacy and picture upload to the web, or create wish reports on Bugzilla, respectively.

By the way, I reckon pretty much any digiKam user uses face recognition to find pictures of friends easily, not to stalk them or do even worse things. If someone really has the intention to do so, that person has plenty of options, regardless of digiKam's development...

In this sense, good night, and hoping that this thread finds its way back to its original course.

Stefan

Sent from a fair mobile

On 05.02.2019 at 21:03, J Albrecht wrote:
> “killfiled”? How ironically appropriate. [...]