write.bz

Reader

Read the latest posts from write.bz:

from Citizenry Blog

Nonprofit organizations running online services may have a greater alignment of values and goals with their members compared to for-profit service providers. These organizations are typically mission-driven and focused on serving a specific community or social cause. This mission-driven focus can lead to a greater sense of trust and accountability among members. Additionally, because nonprofits don't have to generate profits for shareholders, they may be able to provide services at lower fees, which can make those services more accessible to a wider range of members, particularly those with lower incomes or limited resources.

Nonprofits may also prioritize community engagement and involve members in decision-making processes. This can lead to a greater sense of ownership and engagement among members and a deeper understanding of their needs and the issues they face. Nonprofits have a longer-term focus on serving their members and communities, compared to for-profit service providers who may be more focused on short-term profits. This long-term focus can lead to more targeted and effective services that can have a positive social impact on the communities they serve.

Transparency is also an important aspect of nonprofit organizations. They are typically more open about their operations and decision-making processes, which builds trust and accountability: members can see how their contributions are being used and how decisions are being made.

Lastly, nonprofits have more flexibility in the way they can operate. They can take on more risk than for-profit companies and prioritize their mission over profits. This flexibility can allow nonprofits to be more innovative and adaptable, which can lead to more effective and efficient services. However, it's worth noting that not all nonprofit organizations are created equal; some may not have the same level of transparency, accountability, or resources as others, and may not always be able to provide the same level of service as for-profit providers.

 
Read more...

from Citizenry Blog

Decentralization in online technology is needed for several reasons, including increased security and privacy, censorship resistance, and the ability to build more resilient and robust systems. A decentralized system is one in which power and control are distributed among multiple actors rather than being centralized in a single entity. This makes it more difficult for hackers to compromise the system as a whole, and harder for governments or other organizations to censor or control it.

Another important reason for decentralization in online technology is that it can make systems more resilient and robust. Decentralized systems are less vulnerable to single points of failure, which means that if one part of the system goes down, the entire system does not have to shut down. This is particularly important for critical systems like communication networks and financial systems, which need to be available and reliable at all times. Additionally, decentralized systems can be more difficult to take down or disrupt, as there is no single point of control to target.

 
Read more...

from Citizenry Blog

It is no secret that Google and other web crawlers are indexing Mastodon instances. This means your posts, username, profile, and other identifiable information are being added to their indexes. If you host a site, including a Mastodon instance, you may want to block these and other crawlers from some or all of your site(s).

Check Mastodon Admin > Preferences > Other > Opt-out of search indexing. This is a blanket option that adds:

<meta content='noindex' name='robots'>

You may want more fine-grained control with the options below.

More ways to block the creepy crawlers:

Add to your robots.txt file:
http://www.robotstxt.org/robotstxt.html http://www.robotstxt.org/db.html
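For example, a minimal robots.txt served at the root of your domain could ask Googlebot (or every crawler) to stay out entirely:

# Block Googlebot specifically
User-agent: Googlebot
Disallow: /

# Or block all crawlers
User-agent: *
Disallow: /

Keep in mind robots.txt is only advisory: well-behaved crawlers honor it, but nothing enforces it.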

Add meta robots tag(s):

<meta name="robots" content="noindex, nofollow">

https://searchengineland.com/meta-robots-tag-101-blocking-spiders-cached-pages-more-10665
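If you only want to keep crawlers out of specific pages, the tag goes in each page's <head>. A minimal example (the title is just a placeholder):

<head>
  <title>Example page</title>
  <!-- Ask compliant crawlers not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>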

Add headers to Nginx to block crawlers:

add_header  X-Robots-Tag "noindex, nofollow, nosnippet, noarchive";
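As a rough sketch, assuming a typical setup where your site or Mastodon instance is defined in an Nginx server block, the header goes inside that block so it is sent with every response (the hostname below is a placeholder):

server {
    server_name social.example.com;  # placeholder hostname

    # Tell compliant crawlers not to index, follow, snippet, or archive anything served here
    add_header X-Robots-Tag "noindex, nofollow, nosnippet, noarchive";

    # ... rest of your existing configuration ...
}

Reload Nginx afterwards (for example with nginx -s reload or systemctl reload nginx) for the header to take effect.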

Block via your WAF/Firewall: More advanced ways are to use a WAF or even iptables to block requests based on their User-Agent string.

Example:

iptables -A INPUT -p tcp --dport 80 -m string --algo bm --string "Googlebot/2.1 (+http://www.google.com/bot.html)" -j DROP

https://developers.google.com/search/docs/crawling-indexing/overview-google-crawlers
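One caveat with the iptables approach: string matching only works for plain HTTP on port 80, because on HTTPS (port 443) the User-Agent header is encrypted and invisible to the firewall. For TLS traffic, a sketch of the same idea at the web-server layer (assuming Nginx, with Googlebot and bingbot as example user agents) is to match the User-Agent and refuse the request:

# Inside a server block: return 403 to selected crawler user agents
if ($http_user_agent ~* "(Googlebot|bingbot)") {
    return 403;
}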

 
Read more...