
Derek Martin

@me@derekmartin.org

I'm Derek. I am a nerd. Principal Program Manager, Azure Core, Global Resiliency. Toots are my opinion and don't represent my company.

311 Posts · 250 Following · 547 Followers

Well, that was unexpected

@Unixbigot Oooohhh well done!

@spoofy If it makes dealing with shitty print drivers less evil, then bless you.

@Chron Cannon fodder for the magats

@thekenyeung @Flipboard @mike Make sure to register with verifiedjournalist.org as folks come on board!

Derek Martin boosted

This is your algorithm. If you don't want to miss posts from someone, turn this on.

Derek Martin boosted

Fun fact: the MP3 file format is as old today as 8-track tapes were when MP3 was invented.

Derek Martin boosted

Why I prefer Star Trek.

Derek Martin boosted

Six US lawmakers have urged the FTC to take action on the merger of Kroger and Albertsons. Living in the Seattle area, this would effectively mean there was one company running the majority of the grocery stores in the area.

I continue to be perplexed that the FTC has so much focus on novel interpretations of antitrust laws to apply to big tech companies when there are many textbook monopolies harming American consumers which seem to be on the back burner. It’s bonkers.

www.reuters.com/markets/deals/

@marcelias
Shhh you're saying the quiet part out loud!

@TheConversationUS
@histodons@a.gup.pe - the value is that we should never ever stop talking or stop trying

@imcdnzl
What a goofy name. Another marketing department win!

Derek Martin boosted

@slashdot Far too many stupid names of products to keep up.

@Teri_Kanefield @briankrebs To keep trump away from the White House, I’d crawl over broken glass nekkid to vote for her if it comes to that. Switch parties to get to vote for her in the primary? Happily.

@shanselman Cause pew pew boom boom!

@QasimRashid How is it possible that THIS Supreme Court did that?

Derek Martin boosted

Zuckerberg heading into 2024

Derek Martin boosted

One massive @Vivaldi update is out.

It includes things like:

- Full history sync
- Sessions panel (you can even edit sessions without opening them!)
- Workspace rules, for automatically moving tabs into the right workspace
- Append to notes
- Ability to find open tabs on other devices

Enjoy!

vivaldi.com/blog/vivaldi-on-de

Derek Martin boosted

IFTAS intends to provide guidance, and to operate or facilitate services that support electronic service providers (ESPs) who require assistance mitigating Child Sexual Abuse Material (CSAM) on their services.

Motivation

IFTAS serves the independent social media trust and safety community, and is driven in large part by the community Needs Assessment.

Support for CSAM issues is consistently ranked as one of the most requested needs, and as such IFTAS is seeking to mitigate the legal exposure and personal trauma faced by ESPs and content moderators who are tasked with moderating CSAM. 

Regulatory compliance requires ESPs to either actively scan for, or respond to reports of CSAM on their service. Understanding the regulatory requirements is confusing and jurisdictionally complex. Detection solutions can be costly, technically difficult to implement, and pose an additional regulatory burden. Moderating CSAM can be traumatic. The various bodies engaged in child safety are not open to working with thousands of ActivityPub service providers.

IFTAS wishes to:

Promote a healthier, safer Internet;
Reduce the regulatory burden and legal exposure for ESPs;
Minimise harm to content moderators;
Provide or facilitate the use of CSAM classification services while preserving privacy and security to the fullest extent possible;
Reduce duplicative effort;
Serve as a trusted voice for this issue in the open social web.

IFTAS Activities and Services

IFTAS intends to make various resources available, including but not limited to the following:

Content moderator trauma support

Moderators exposed to CSAM via their moderation workflows have expressed the need for post-trauma support. Working with the Centre for Abuse and Trauma Studies at Middlesex University London, IFTAS is reviewing self-help materials and guidance on trauma mitigation, to be made available in the forthcoming IFTAS community library.

Legal and regulatory guidance

While we have published some guidance already (github.com/iftas-org/resources), IFTAS plans to consult with domain experts in relevant jurisdictions to provide guidance for ESPs, updating routinely to ensure accurate, actionable guidance from trustworthy sources.

IFTAS Media classification

Safer is an IFTAS-hosted enterprise deployment that performs hash-matching on images and videos securely transmitted from opted-in services to IFTAS for classification, and creates an automatic report to NCMEC if required. Shield is a hash-matching API from the Canadian Centre for Child Protection that can be called to examine locally-hosted media (images and video) and provide a classification. 3-is is a similar service oriented to EU hosts. IFTAS is exploring methods to facilitate access to these services. 

safer.io, projectarachnid.ca/en/#shield, www.3-is.eu/
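
For anyone curious what hash-matching looks like in practice, here is a minimal sketch of the general flow, assuming a plain cryptographic digest and a locally supplied hash list. It is not the Safer or Shield API: production systems use robust perceptual hashes (PhotoDNA-style) that survive resizing and re-encoding, and a confirmed match carries mandatory reporting obligations.

# Minimal sketch of the hash-matching flow, NOT the Safer or Shield API.
# Real deployments use perceptual hashes rather than SHA-256, and matches
# trigger legally required reports; this only shows the shape of the check.
import hashlib
from pathlib import Path

# Placeholder set of known-bad digests, as supplied by a classification service.
KNOWN_HASHES: set[str] = set()

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in 64 KiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def check_upload(path: Path) -> bool:
    """True if the uploaded file's digest matches a known-bad hash."""
    return sha256_of(path) in KNOWN_HASHES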

fedi-safety

fedi-safety is an open-source CLIP interrogation tool that can help classify images. IFTAS is exploring methods to facilitate the use of fedi-safety locally, or as a third-party service.
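
As a rough illustration of what "CLIP interrogation" means, the sketch below scores an image against a couple of text prompts with an off-the-shelf CLIP model via Hugging Face transformers. It is not fedi-safety's actual code or model, and the prompts are placeholders; it only shows the zero-shot classification pattern such tools build on.

# Illustrative CLIP zero-shot scoring, NOT fedi-safety's implementation.
# Assumes the transformers, torch, and Pillow packages are installed.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

LABELS = ["a benign photo", "an image that needs human review"]  # placeholder prompts

def classify(path: str) -> dict[str, float]:
    """Score an image against the text prompts and return per-label probabilities."""
    image = Image.open(path)
    inputs = processor(text=LABELS, images=image, return_tensors="pt", padding=True)
    probs = model(**inputs).logits_per_image.softmax(dim=1)[0]
    return dict(zip(LABELS, probs.tolist()))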

Known Hashtags

IFTAS plans to provide service administrators with a rolling list of known hashtags in use by sellers and sharers of CSAM, to support local service moderation decisions.

Known Hosts

IFTAS plans to provide service administrators with a rolling list of services seen to host CSAM with no intent or ability to moderate the content, to support local service moderation decisions.
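
Taken together, the two rolling lists above could be consumed by a moderation tool roughly like this. The file names and one-entry-per-line format are placeholders, since no distribution format is specified here.

# Hypothetical consumer of the rolling hashtag and host lists described above;
# file names and the one-entry-per-line format are assumptions, not an IFTAS spec.
def load_list(path: str) -> set[str]:
    """Load a newline-delimited denylist into a lowercase set."""
    with open(path, encoding="utf-8") as fh:
        return {line.strip().lower() for line in fh if line.strip()}

KNOWN_HASHTAGS = load_list("known_hashtags.txt")
KNOWN_HOSTS = load_list("known_hosts.txt")

def needs_review(hashtags: list[str], origin_host: str) -> bool:
    """Flag a post for human review if it uses a listed hashtag or originates from a listed host."""
    if origin_host.lower() in KNOWN_HOSTS:
        return True
    return any(tag.lower().lstrip("#") in KNOWN_HASHTAGS for tag in hashtags)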

Best Practice

IFTAS is consulting with child safety experts, including INHOPE, Arachnid, NCMEC, End Violence Against Children and others, to source and share best practices for moderation workflow enhancements that minimise harm for moderators likely to be exposed to CSAM, for example blurring images, rendering them in monochrome, and using a dedicated browser profile for this work.

Reference Material

IFTAS Moderator Needs Assessment Report (Q3 2023)
CSAM-CSE (IFTAS Guidance) 
About Child Safety on Federated Social Media – Fediversity – SocialHub

2023-08-04 Special Topic Call – Social Web and CSAM: Liabilities and Tooling
Integrate PhotoDNA to scan for known CSAM · Issue #21027 · mastodon/mastodon · GitHub
github.com/mastodon/mastodon/i
Lemmyshitpost community closed until further notice – Lemmy.World
github.com/LemmyNet/lemmy/issu
Stanford researchers find Mastodon has a massive child abuse material problem – The Verge
Child Safety on Federated Social Media

about.iftas.org/2023/12/13/ift

Derek Martin boosted

He’s making a list,
He’s checking it twice,
He’s going to decide,
Who’s naughty or nice…
Zuckerberg is coming to town.
