
CEO’s Corner: Silence Is a System — Why Knowing Your Rights Isn’t Enough Anymore

Dear Partners in Progress,

A new kind of silence is spreading, not born of fear but enforced by systems designed to erase truth. Books are banned. Police misconduct is scrubbed from databases. Medical bias is buried in code. And at the center are real people: questioned, ignored, recorded without consent, denied care, or arrested simply for speaking up.

At Lustitia Aequalis, we don’t accept this silence as business as usual. We’re building the tools to challenge it, because justice only works if people can speak it and prove it. Whether you’re stopped by police, denied medical care by an algorithm, or silenced in a classroom, what protects you is not just the law. It’s your ability to use it, clearly and on record.

That’s why we’re focused on frontline tools that work in the moment:

  • Scripts that protect you when police approach

  • Recording rights that hold up under pressure

  • The Witness App, which documents abuse and sends footage straight to the cloud

  • Legal campaigns for misconduct transparency and AI accountability

  • Actionable guidance when censorship and bias show up in schools, clinics, or city streets

🛠️ This Is What Frontline Protection Looks Like

Our work starts where rights are easiest to ignore: in public spaces, in digital systems, in split-second decisions that follow people for years.

You don’t need to argue your worth. You need to be prepared.

You need to know the law, and know what to say when it counts. And you need tools that work without permission.

That’s what we’re building.

That’s who we are.

🔥 Because the Record Is What Stays

We’re calling on educators, legal workers, students, caregivers, and community members to join us. Because in a system that hides its history, your footage, your story, and your clarity are the public record. Let’s protect what power tries to erase.

In Solidarity,

Ashley T. Martin

CEO, Lustitia Aequalis


“Am I Being Detained?” Three Sentences That Can Protect You in Police Encounters

By Lustitia Aequalis

When police stop you on the street, at your car, or outside your home, it’s easy to freeze. That’s not just nerves. It’s how pressure works. And officers are trained to use it.

But pressure doesn’t have to strip away your rights. In those moments, your voice can still create a legal boundary. Just three sentences, spoken calmly, can make the difference between walking away and being pulled deeper into the system.

“Am I being detained, or am I free to go?”

“I do not consent to any searches.”

“I want to speak with a lawyer.”

These are not slogans. They are functional tools, recognized by law and sharpened by decades of civil rights defense. Learn them. Practice them. And when the time comes, use them with clarity.

🚔 First: “Am I being detained, or am I free to go?”

This question forces clarity. If you’re not being detained, you have the legal right to walk away. If you are, officers must tell you why.

According to the ACLU of Northern California, California law requires police to state the reason for a stop before questioning you. They must also document the reason in a citation or report. If they cannot give a clear answer, the legality of the stop is questionable.

Use this question to take control early. It sets a boundary and signals that you know the law.

🔍 Second: “I do not consent to any searches.”

Even if you have nothing to hide, never give permission to be searched. Officers may search you anyway, but if you clearly say you do not consent, you protect your ability to challenge that search later in court.

If you consent, anything found can be used against you, even if officers had no legal basis to search in the first place. Clearly stating that you do not agree creates a record of non-consent. This matters, especially if the case ever reaches a courtroom.

The ACLU guide advises saying this every time an officer tries to search your body, car, bag, or phone. Do not resist physically. Simply repeat the sentence.

🧠 Third: “I want to speak with a lawyer.”

This is your legal shield. Once you say this, officers are required to stop questioning you. If they continue, anything you say may be thrown out in court.

Invoke this right as soon as you are arrested or believe you may be.

Don’t wait for the perfect moment. You do not need to explain why, and you do not need to answer any further questions. Just say it, and then stay silent.

The ACLU also recommends saying “I want to remain silent” alongside this statement. Together, these two phrases stop the clock on voluntary questioning.


👁️ Legal Rights in Real Time

Police and federal agents often act before they explain. But that does not mean you have no say. Whether the badge says LAPD or ICE, you still have rights. And that includes the right to record.

In Southern California, community members have used smartphones to film ICE detentions at parking lots, car washes, and grocery stores. These videos have become vital evidence, especially in cases where people are wrongly targeted or where force is used.

You are legally allowed to film officers in public spaces, as long as you are not interfering. You can record badge numbers, license plates, and the surrounding area. You can ask detainees for their full name, birth date, and a contact person. You can tell them they have the right to remain silent. And you can ask officers for their name or to show a warrant.

They may not answer. But you have the right to ask.

According to ACLU counsel Peter Eliasberg, if an officer tells you to stop recording, you can respond: “I am exercising my right to document this interaction.” If they tell you to move, comply—but record yourself doing it. That video shows you followed instructions and protected your safety.

🔒 Protect Yourself, Not Just in the Moment

If you record, lock your phone with a passcode. Police may try to take your phone or force you to unlock it. Courts have generally held that passcodes are protected under the Fifth Amendment; fingerprint and facial ID are not.

Back up footage immediately, and report abuse to legal organizations like the ACLU. But don’t rely on post-event help alone. Resources are limited. What protects you most is your preparedness.

When you know what to say, you reduce confusion. You create a record. And most importantly, you protect yourself and the people around you.

🔒 Record with Purpose. And Protection.

Knowing your rights is step one. Recording them is step two. But in the middle of a high-pressure moment, pulling out your phone isn’t always enough. That’s why Lustitia Aequalis created the Witness App.

With one tap, the Witness App:

  • Starts recording video and audio

  • Sends a real-time alert and location to your emergency contact

  • Uploads footage securely to the cloud so it can’t be deleted, lost, or seized

Whether you’re being stopped, witnessing abuse, or trying to protect someone else, Witness puts legal power in your hands, backed by evidence.

Filming the Police Is Legal—So Why Are People Still Getting Arrested for It?

By Lustitia Aequalis

In the aftermath of George Floyd’s murder, the world saw the truth not because of official footage, but because a teenager held up her phone. Darnella Frazier didn’t have a badge, a press pass, or legal training. She had a camera, and she knew she could use it.

That right—to record public officials in public spaces—is protected under the First Amendment. Yet people across the U.S., especially Black and brown individuals, are still being detained, harassed, and arrested for pulling out their phones.

If you’ve ever hesitated before hitting record, you’re not alone. But knowing your rights can help you protect not just yourself, but others too.

📸 Your Right to Record: What the Law Says

You are legally allowed to record on-duty police officers in public places. This has been upheld by every federal appeals court that has reviewed the issue. That includes public sidewalks, parks, and streets—anywhere you’re lawfully allowed to be.

According to the ACLU, taking photos or video of things plainly visible from public spaces is a constitutional right. That includes recording public officials like police officers.

But officers don’t always follow the law. So it’s critical to understand the limits and how to assert your rights without escalating the situation.

✅ What You Can Do

  • You can record officers performing duties in public as long as you don’t interfere with their work.

  • You can remain silent. You are not obligated to answer questions while recording.

  • You can refuse to hand over your phone or recordings unless the officer has a valid search warrant.

  • You can refuse to unlock your phone. Many courts have ruled that passcodes are protected under the Fifth Amendment.

You also do not have to announce that you are recording, though recording openly is usually safer in public settings. Keep your hands visible, remain calm, and keep a safe distance.

⚠️ What You Cannot Do

  • Do not interfere. If an officer is making an arrest or securing a scene, you may be asked to move. You must comply with that order as long as it’s reasonable.

  • Do not resist arrest, even if it’s unlawful. Clearly state that you do not consent to a search and that you are exercising your right to remain silent.

  • Do not assume all officers will respect your rights. Some will still threaten, bluff, or confiscate your phone.

As FIRE explains, your right to record can be overridden only if law enforcement shows a legitimate need—like safety, not personal discomfort. Recording in a public space, even during tense situations, is not a valid reason for police to stop you.

🌐 Recording Rights Beyond the U.S.

The right to film police isn’t just a U.S. issue. Countries like the United Kingdom, Canada, and Germany allow civilians to record law enforcement in public—although limits and enforcement vary.

In some authoritarian regimes, however, simply recording a police officer can lead to detention, surveillance, or worse. That contrast matters. It highlights why the ability to document state power in action is not just a technical right—it’s a public safeguard.

When police departments know they’re being watched by more than body cams and internal reviews, accountability becomes a shared responsibility. And in democracies, that responsibility belongs to the people.

🔐 Protect Your Footage Before It’s Taken

Legal protections mean little if your video disappears.

Officers cannot legally delete your footage or force you to hand it over without a warrant. But that doesn’t stop them from trying. Phones get taken. Recordings get lost. In some cases, footage is wiped on the spot.

The best defense is preparation. Back up your videos. Use strong passwords. Disable facial and fingerprint unlocking. And when possible, record with a tool designed for high-pressure encounters.

📱 When the Stakes Are High, Use the Right Tool

Created by Lustitia Aequalis, Witness is built for the exact moments when knowing your rights might not be enough.

With one tap, it:

  • Starts recording video and audio

  • Sends a live alert and your location to a trusted emergency contact

  • Uploads your footage to the cloud—automatically and securely

  • Allows for anonymous recording and submission, so your footage speaks even when you can’t

Whether you’re filming police misconduct, an immigration raid, or a questionable stop, Witness helps ensure your footage isn’t just captured—it’s protected.
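For readers curious what a one-tap flow like this looks like under the hood, here is a conceptual sketch. Every name in it is hypothetical, not the Witness App's actual code or API; it only illustrates the pattern the feature list describes: start recording, alert a trusted contact with your location, and stream footage off-device immediately so it can't be deleted locally.

```python
import hashlib
import time
from dataclasses import dataclass, field

# Hypothetical names throughout -- this is NOT the Witness App's real API.
@dataclass
class Evidence:
    started_at: float
    location: tuple                              # (latitude, longitude)
    chunks: list = field(default_factory=list)   # hashes of uploaded chunks

def one_tap(location, alert_contact, upload_chunk):
    """Begin a recording session; returns the session and a chunk handler."""
    session = Evidence(started_at=time.time(), location=location)
    alert_contact(f"Recording started at {location}")  # real-time alert

    def on_chunk(data: bytes):
        digest = hashlib.sha256(data).hexdigest()  # tamper-evidence per chunk
        upload_chunk(data, digest)                 # off-device immediately
        session.chunks.append(digest)

    return session, on_chunk

# Demo with in-memory stand-ins for the network:
alerts, uploads = [], []
session, on_chunk = one_tap((34.05, -118.25), alerts.append,
                            lambda data, digest: uploads.append(digest))
on_chunk(b"frame-1")
on_chunk(b"frame-2")
assert uploads == session.chunks and len(alerts) == 1
```

The key design point is that each chunk leaves the device as it is recorded, so seizing or wiping the phone cannot destroy footage already uploaded.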

What’s Your Police Score? Why the Fight for Misconduct Transparency Isn’t Over

By Lustitia Aequalis

When a teacher is fired for misconduct, their license can be flagged. When a doctor is under investigation, their name shows up in a searchable registry. But when a police officer abuses their power, resigns, and quietly gets rehired somewhere else, there’s often no record to warn the next department—or the next community.

That was what the National Law Enforcement Accountability Database (NLEAD) aimed to change. Launched in late 2023, NLEAD was the first centralized record of federal officer misconduct, managed by the Department of Justice and used by nearly 90 law enforcement agencies.

But on January 20, 2025, the database was shut down with the stroke of a pen. Its website went dark. Its data went silent. And a year’s worth of work to hold officers accountable vanished, without legal notice or public explanation.

🗂️ What Was NLEAD—and Why It Mattered

The NLEAD database was more than an internal tool. It was a protective measure designed to stop a known problem in law enforcement hiring: wandering officers. These are individuals who were suspended, fired, or forced to resign due to misconduct—then rehired elsewhere with no paper trail to warn the new department.

According to CBS News, NLEAD logged nearly 4,800 misconduct cases between 2018 and 2023. Of those, almost 1,500 officers were suspended, terminated, or resigned under investigation. More than 300 officers were criminally convicted.

The database allowed federal agencies to search an officer’s track record before hiring. It offered a safeguard. And it came with due process protections so officers could challenge false entries. It was built in collaboration with law enforcement leaders and civil rights experts alike.

Even the International Association of Chiefs of Police supported it, saying it gave hiring leaders the context they need to make informed decisions.

🚫 What Happened—and Why It Raises Red Flags

On his first day back in office, President Trump rescinded 78 executive orders signed under Biden—including the one that authorized NLEAD. No new replacement system has been proposed.

As CREW pointed out, this shutdown may have violated federal records law under 44 U.S.C. § 3106, which requires agencies to notify the National Archives before deleting or removing any official federal records. So far, there’s no indication the DOJ followed this legal process.

This decision didn’t just cut off access for hiring managers. It erased a structural response to public outcry after George Floyd, Elijah McClain, and thousands of lesser-known cases that never made headlines but scarred families and communities just the same.

The silence around the shutdown—no press briefing, no replacement, no public explanation—reflects a deeper issue: who gets to control the narrative of police conduct, and who is kept in the dark.

🧭 Why Public Databases Still Matter

Civilian oversight doesn’t work without data. Law enforcement leaders can't make informed hiring decisions if misconduct is buried. And communities can’t demand change if records are kept behind locked doors or erased without notice.

Globally, there is growing movement toward public officer accountability:

  • In Canada, some provinces publish public police discipline decisions online.

  • In the UK, the Independent Office for Police Conduct maintains a searchable record of misconduct outcomes.

  • In New Zealand, complaint investigations by the Independent Police Conduct Authority are available to the public, including summaries of findings.

In the U.S., however, records are still siloed, shielded, or sealed—especially at the federal level. Without shared infrastructure, bad actors can move unchecked. And the public is asked to trust officers who answer to no one but each other.

At Lustitia Aequalis, we believe transparency is not just a policy—it’s a protective structure. That’s why our team is working toward building what we call transparency ecosystems: integrated tools that make misconduct visible, traceable, and usable for those who need it most—families, legal advocates, journalists, community organizers.

We are committed to rebuilding what NLEAD was supposed to become:

  • A system that serves the public, not just departments

  • A record that cannot be quietly deleted

  • A layer of safety that doesn't depend on perfect hiring managers or perfect circumstances

Until then, the tools we do have—like citizen recording and real-time documentation—matter more than ever.
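What would "a record that cannot be quietly deleted" look like in practice? One well-known approach is a hash-chained, append-only log, sketched below. This is an illustrative example only, not a description of any existing Lustitia Aequalis or government system, and the field names are made up. Each entry commits to the hash of the entry before it, so editing or removing any record breaks every later hash and the tampering becomes visible.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first entry

def append_entry(log, record):
    """Append a record, chaining its hash to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else GENESIS
    payload = json.dumps(record, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"record": record, "prev": prev_hash, "hash": entry_hash})
    return log

def verify(log):
    """Recompute the chain; any edit or deletion breaks it."""
    prev = GENESIS
    for entry in log:
        payload = json.dumps(entry["record"], sort_keys=True)
        expected = hashlib.sha256((prev + payload).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True

log = []
append_entry(log, {"officer_id": "A-123", "outcome": "terminated"})
append_entry(log, {"officer_id": "B-456", "outcome": "resigned under investigation"})
assert verify(log)
log[0]["record"]["outcome"] = "cleared"   # a quiet edit...
assert not verify(log)                    # ...is immediately detectable
```

A structure like this doesn't prevent deletion by whoever holds the only copy, which is why public mirrors and distributed custody matter as much as the data format.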

📱 One Record Can Change Everything

You may not have access to a misconduct database. But you can still help create public records in real time. That’s where the Witness App comes in.

Developed by Lustitia Aequalis, the Witness App allows you to:

  • Record video and audio discreetly

  • Upload your footage instantly and securely to the cloud

  • Notify a trusted emergency contact with your location

  • Use the anonymous upload feature when safety requires distance

In a world where records vanish and power protects itself, your phone can help preserve the truth before it disappears.

Guarding Rights in the Age of AI Surveillance

By Lustitia Aequalis

Walk down the street in New Orleans, and you might pass a camera that knows your name. Not because you told it. Because someone, somewhere, told it to know your face.

In the last two years, the spread of facial recognition and AI-powered surveillance has moved faster than laws can keep up. Police departments across the country are using tools that not only track movement, but claim to predict behavior. These systems don’t just watch—they decide. And for Black, brown, and gender-diverse people, those decisions come with a long shadow.

From private camera networks in Louisiana to data swaps in Milwaukee and social media stings in Memphis, the AI-policing pipeline is here—and it's already being abused.

👁️ Real People. Real Tracking. No Oversight.

In New Orleans, police were quietly using alerts from a network of privately owned facial recognition cameras—not the city’s system, but one run by a nonprofit called Project NOLA. Residents and businesses installed these cameras, then fed the video to an app officers used to get real-time pings when someone flagged on a watchlist showed up on screen.

According to The Marshall Project, this system was used in dozens of arrests—including high-profile incidents that reignited debate over unchecked surveillance.

The problem? It operated outside the oversight the city council mandated. No public notice. No audit trail. No accountability. And it’s not just New Orleans. Departments in San Francisco, Austin, and Milwaukee have sidestepped local limits by outsourcing the surveillance or trading citizen data for tech access.

Meanwhile, police in New York City are experimenting with AI that tracks people based not on face, but on “behavior.” If you move in a way the algorithm deems “irrational,” it can flag you for intervention—no explanation required.

⚖️ Biased Code in Biased Systems

Let’s be clear: this isn’t about neutral technology.

As the ACLU and leading scholars have shown, facial recognition is measurably worse at identifying people of color—especially Black women. Algorithms misidentify Black women’s faces at rates up to 35 percent, while white men are rarely misclassified. The federal government’s own 2019 audit confirmed these disparities.

Add to that the fact that many systems are trained on mugshot databases, which reflect a criminal legal system that already arrests and charges Black and brown people at disproportionate rates. Feed biased data into AI, and it doesn’t get smarter—it gets more efficient at reinforcing harm.

And even if the code were perfect, the system it’s plugged into isn’t. Surveillance has always fallen hardest on communities of color. Whether it's 18th-century lantern laws or 21st-century predictive policing, the message stays the same: your existence is suspicious.

🏙️ Urban Watchlists in Disguise

What’s especially dangerous about today’s tech is how quietly it operates. You don’t see a cop tailing you. You don’t hear a knock at your door. But a camera tags you. A system logs you. And before you know it, you’re part of a digital paper trail that can follow you to your next job, your next stop, or your next protest.

This isn’t science fiction. It’s AI-powered alerts, social media traps, and emotion-detection algorithms—all already in use.

And now, there’s a push in Congress to ban all state-level AI regulation for the next ten years. As Tech Policy Press reports, that moratorium would gut existing protections and block new ones—leaving cities and states powerless to respond as surveillance tech evolves.

Meanwhile, Big Tech keeps expanding its reach, selling facial recognition access in exchange for booking photos, pushing predictive policing software into cash-strapped cities, and fighting regulation under the guise of “innovation.”

But innovation without rights is just exploitation.

🛡️ What Lustitia Aequalis Is Doing

At Lustitia Aequalis, we are actively building protections against tech-driven abuse.

We don’t believe surveillance should be the price of public safety. And we know that if we don’t create the protections ourselves, no one else will.

The Witness App gives you the power to create your own record securely and anonymously.

  • Start recording video and audio with one tap

  • Send your location to a trusted contact in real time

  • Back up footage instantly to the cloud

  • Use anonymous upload if your safety is at risk

Whether you’re walking home, being followed, or witnessing a stop that feels off, Witness helps you document before you’re erased from the story.

Health Equity 2.0: When AI Decides Who Gets Care

By Lustitia Aequalis

In 2019, a widely used hospital algorithm quietly downgraded the care needs of Black patients. Not because they were healthier—but because they had historically spent less on healthcare, and the system had equated lower spending with lower risk. It wasn’t malice. It was code. And it reflected the same truth we've always known: inequality doesn’t disappear in digital systems. It gets embedded deeper and faster.

Now, AI is reshaping how people access care. From scheduling appointments to determining who gets flagged for follow-up, artificial intelligence is becoming the new frontline in medicine. But in the hands of systems already shaped by inequity, AI can deepen gaps instead of closing them—especially if no one is watching how it works, or who it works against.

📲 Where the Law Doesn’t Reach, the Algorithm Still Decides

Let’s break this down:

Most people assume that their health information is protected. In hospitals, it usually is. But once AI enters the picture—through apps, wearable devices, or even text-message reminders—those protections thin out fast.

Here’s what you need to know:

  • Telehealth platforms and apps are not always bound by HIPAA. Many health tech tools collect your medical, behavioral, and personal data under looser privacy terms. If you didn’t read the fine print, your information may already be shared with third parties.

  • AI uses massive health datasets—often without full patient consent. These systems are trained on years of patient data to predict diagnoses, risk levels, and treatment plans. But patients rarely get to opt in—or out. (source)

  • Many algorithms reflect historical bias. A 2019 study in Science found that an algorithm used in U.S. hospitals systematically downplayed Black patients’ needs. Why? It used past healthcare spending as a stand-in for health status. But when past care has been unequal, that logic reinforces discrimination. (source)

And there’s no federal law right now that requires hospitals or health apps to explain how their AI makes decisions—or who might be harmed in the process.
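The spending-as-proxy failure described above is easy to demonstrate. The toy simulation below uses entirely made-up numbers (it is not the actual 2019 algorithm or its data): two groups have identical underlying health need, but one group historically had less access to care, so its recorded spending is lower. Ranking patients by past spending then systematically under-serves that group.

```python
import random

random.seed(0)

def make_patient(group):
    need = random.gauss(50, 10)            # true health need, equal across groups
    access = 1.0 if group == "A" else 0.6  # historical access gap (hypothetical)
    spending = need * access               # recorded cost reflects access, not need
    return {"group": group, "need": need, "spending": spending}

patients = [make_patient(g) for g in ("A", "B") for _ in range(1000)]

# An algorithm that flags "high-risk" patients by past spending (the proxy)
# ranks group B far below group A despite equal need.
top = sorted(patients, key=lambda p: p["spending"], reverse=True)[:500]
share_b = sum(p["group"] == "B" for p in top) / len(top)
print(f"Group B share of 'high-risk' slots: {share_b:.0%}")
```

Even though both groups have the same average need, group B captures only a tiny fraction of the "high-risk" slots, which is the mechanism the Science study documented: the proxy, not the patients, carries the bias.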

🧠 The Risks of “Smart” Healthcare

AI in medicine isn’t just about convenience. It influences real decisions about real lives. And when things go wrong, it’s not always clear who’s responsible.

Here’s how the legal gaps show up in daily life:

  • Consent gets buried. Many health apps bundle consent for medical use, research, and marketing in a single checkbox. You may not realize you’re giving up rights to your data. (source)

  • AI bias is invisible—but dangerous. From skin cancer diagnostics trained only on white patients, to chatbots that misinterpret symptoms in non-English speakers, skewed data leads to flawed care. (source)

  • Accountability is unclear. If an AI system gives bad advice that harms a patient, who’s liable—the doctor, the developer, or the clinic? U.S. law hasn’t answered that yet. (source)

  • Informed decision-making is harder. AI tools often function like black boxes. Even clinicians don’t always know how an output was generated—or if it can be trusted.

The law hasn’t caught up to the tech. That means people who are already vulnerable—non-English speakers, low-income families, disabled or undocumented individuals—are most at risk of silent harm.

🌍 Equity Isn’t Optional—It’s Global

Other nations are already demanding safeguards.

  • The World Health Organization has called for strict AI oversight, warning that biased data and automation can worsen health disparities. (source)

  • In India, AI tools are transforming preventive care for diseases like tuberculosis—but without strong legal protections for patient privacy, those benefits could come at a cost.

  • In the EU, new laws require healthcare AI systems to be explainable and non-discriminatory by design.

But in the U.S., there’s still no binding standard. Your privacy, your access to care, and your ability to challenge a faulty algorithm depend entirely on what state you’re in—or what platform you're using.

🛡️ Protect Yourself in Tech-Based Care

Until the law catches up, here’s how to stay protected:

  • Ask how your data is used before consenting to any telehealth, app, or wearable device. If they can’t explain it clearly, that’s a red flag.

  • Keep your own record. If a provider gives dismissive advice or denies care, especially in a remote or app-based interaction, write it down—or record it if legally allowed in your state.

  • Push for transparency. You have a right to know whether AI is being used in your care, and how it’s making decisions.

And above all, trust your instincts. If something feels wrong—if you’re treated differently, if your concerns are dismissed, if a decision seems arbitrary—document it. Systems forget. Data hides. But your record can speak.

When AI Makes the Decisions, You Still Have Rights—And Tools to Defend Them

As healthcare moves from hospitals to apps and decisions shift from doctors to algorithms, legal protection can’t be an afterthought. It has to move with you—through every appointment, every form, every quiet moment where bias or negligence slips through the system unseen.

Lustitia Aequalis was built for these moments.

Our real-time justice tech platform helps you:

  • Assert your rights during AI-driven care or telehealth encounters—with built-in legal prompts tailored to healthcare settings

  • Document bias, refusals, or misconduct when providers ignore your language, status, or concerns

  • Report violations through our civil rights clearinghouse, so systemic harm doesn’t stay hidden

  • Learn your rights clearly, with easy-to-access tools and educational resources designed for the real-world complexity of digital healthcare

If a provider misuses your data, if an algorithm denies you care without explanation, if a telehealth visit leaves you dismissed or at risk—you do not have to navigate that alone.

Our platform doesn’t wait for injustice to finish. It’s there while it happens—turning your phone into a frontline tool for legal clarity, protection, and accountability.

Because in a system where tech makes decisions faster than people can question them, your protection should be just as fast and far more human.


Human Rights Watch 2025 Spotlight: Justice Gaps in the U.S.

By Lustitia Aequalis

What Happens When the Law Stops Protecting—and Starts Erasing?

Across U.S. military bases and borderlands, libraries are being gutted, refugee doors are slamming shut, and constitutional rights are quietly stripped from children before they’re old enough to vote. That’s not hyperbole—it’s the 2025 state of justice according to Human Rights Watch.

This year’s report flags the U.S. for ongoing failures in racial justice, censorship in education, migrant protection, and environmental neglect. But beneath every headline is a quieter truth: systemic inequality doesn’t just live in prisons or courtrooms. It lives in what you’re no longer allowed to read, say, or become.

Let’s break down what that means—legally, historically, and practically.

📚 Book Bans in Uniform: The First Amendment on Trial

The Pentagon’s recent orders to purge diversity, equity, and inclusion (DEI) materials from military libraries and K–12 schools have escalated into a full-blown constitutional challenge.

  • Books pulled include I Know Why the Caged Bird Sings, A Queer History of the United States, and even children’s picture books with queer or racial identity themes.

  • Twelve students are now suing Defense Secretary Pete Hegseth, arguing that these bans violate their First Amendment rights to access information and learn freely.

  • These schools—run by the Department of Defense Education Activity (DoDEA)—serve over 67,000 children worldwide and are legally bound to uphold civilian constitutional protections.

While Mein Kampf remains on Naval Academy shelves, books by Black, brown, and queer authors are disappearing. That’s not just irony—it’s targeted censorship with legal consequences.

Read the full lawsuit coverage via The Guardian.

Why It Matters:
When the government dictates which identities are valid enough to learn about, it stops being a question of education and becomes a question of civil liberty. The courts will decide if this is unconstitutional—but for many students, the damage is already done.

🌎 Closed Borders, Broken Systems: Refugee Rights Rolled Back

This year also marks the reversal of decades of humanitarian policy. Through a series of executive orders, the Trump administration has:

  • Suspended all refugee resettlement programs.

  • Ended parole programs for Cubans, Haitians, Nicaraguans, and Venezuelans.

  • Disabled the CBP-One system that allowed people to safely schedule asylum appointments.

  • Revoked Biden-era protections that aimed to humanize and organize asylum processing.

The Legal Context:
Under international law—specifically the 1951 Refugee Convention and its 1967 Protocol, to which the U.S. is party—nations are obligated to provide safe asylum pathways. Unilaterally dismantling these systems may not only violate U.S. precedent, but international human rights law.

The Real-World Impact:
Without safe pathways, vulnerable people are pushed toward traffickers, detention centers, and deadly crossings. This isn’t about political posturing—it’s about life or death for thousands seeking refuge.

Read the full Human Rights Watch report.

⚠️ Censorship Isn’t Just a Policy—It’s a Legal Pattern

What ties these stories together—from book bans to border closures—is a deep erosion of rights that were never equally upheld to begin with.

At the Pentagon’s directive, libraries were told to purge books containing terms like “affirmative action,” “white privilege,” and “critical race theory.” Entire topics—Black history, queer identity, even stories of personal survival—are being deemed incompatible with military values. Read the full report via The Guardian.

We’ve seen this before in history. But what’s different now is how much of it happens quietly, bureaucratically, and without immediate recourse. Unless people push back—legally and publicly—what begins as a memo can become a norm.

🛡️ What You Can Do—And What Tools Can Help

Know this: your rights don’t disappear just because you're on a military base, crossing a border, or challenging a policy.

  • The First Amendment protects your right to read, learn, and protest—even in public schools on federal land.

  • If you're involved in a protest, speak with an attorney first. If you're questioned, ask: “Am I being detained, or am I free to go?”

  • If you’re recording a police or federal encounter, make sure you’re in a public place and not interfering. Some states have audio consent laws—know them, and stay safe.

And when your voice alone might not be enough, your camera can back you up.

Lustitia Aequalis: Standing Up Where Systems Fall Short

At Lustitia Aequalis, we don’t just defend legal rights in the moment. We stand for systemic justice—the kind that protects voices, not just verdicts.

That means calling out legal erosion whether it happens in a jail cell, a school library, or a courtroom. And it means equipping communities to fight back with tools, knowledge, and strategy—not just outrage.

The 2025 Human Rights Watch report makes one thing clear: the gaps in U.S. justice aren’t just oversights. They’re engineered—and expanding. That’s why Lustitia is building tech tools, legal education, and advocacy strategies to protect the people most affected by these systemic patterns: families caught in ICE encounters, students silenced in federal schools, and communities navigating racialized policing and environmental harm.

Because real safety doesn’t start with surveillance—it starts with rights that can’t be erased.

📱 Why the Witness App Matters Now More Than Ever

When systems suppress speech or erase identities, legal protection must become proactive. That’s why Lustitia built the Witness App—to give you real-time power in situations where rights get ignored, dismissed, or violated.

Whether you’re filming a school protest, documenting discriminatory enforcement, or protecting your own peace during a stop, evidence matters.

The Witness App offers:

  • One-tap recording with live alerts to your trusted contacts

  • Automatic cloud backup to protect your footage

  • Legal documentation features designed with frontline risks in mind

Your phone can protect more than just you.

Download the Witness App.

Use it with purpose. Because systemic justice requires visible truth—and the system counts on silence.

