Cake day: January 25th, 2024

  • Here’s the same Obsidian Canvas document open in Obsidian and Hi-Canvas: (just realized the last connection is missing; that was user error while taking the screenshot, disregard)

    They’re not fully cross-compatible, but as another user mentioned, the open-source spec is picking up steam under the Open Canvas Working Group (OCWG), and even the larger industry canvas platforms are working to make their tools import and export that open format easily.

    So hopefully you won’t have to worry about migration much longer :)
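    For the curious: Obsidian’s .canvas files are just JSON (the format Obsidian open-sourced as the “JSON Canvas” spec, which the OCWG work builds on), essentially an array of nodes plus an array of edges. A minimal sketch of reading one, with hypothetical example data:

```python
import json

# A tiny JSON Canvas document: "nodes" and "edges" arrays.
# The data here is made up for illustration.
canvas_json = """
{
  "nodes": [
    {"id": "a1", "type": "text", "x": 0, "y": 0, "width": 200, "height": 100, "text": "Idea"},
    {"id": "b2", "type": "file", "x": 300, "y": 0, "width": 200, "height": 100, "file": "notes/plan.md"}
  ],
  "edges": [
    {"id": "e1", "fromNode": "a1", "toNode": "b2"}
  ]
}
"""

canvas = json.loads(canvas_json)

def connected_pairs(canvas: dict) -> list[tuple[str, str]]:
    """Return (from, to) node ids for every edge in the canvas."""
    return [(e["fromNode"], e["toNode"]) for e in canvas["edges"]]

print(connected_pairs(canvas))  # [('a1', 'b2')]
```

    Because it’s plain JSON, any tool can walk the node/edge graph without Obsidian being involved, which is exactly what makes cross-app import/export feasible.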


  • While that’s technically possible, it’s very difficult, and in my opinion, highly unlikely.

    • All notes are stored in markdown, which is compatible with any other markdown-compatible app. It’s not just a note format, it’s a fire exit.
    • Even the canvas files now have an interoperable format in the works, developed alongside other industry-leading canvas-style software, and that whole process was started voluntarily by the Obsidian team
    • All plugins must be open-source unless explicitly and clearly stated otherwise, and closed-source plugins are only listed on a case-by-case basis. That makes even plugin-specific functionality added to Obsidian easier to port to other software if Obsidian ever does lock things down
    • They don’t have VC investors, and have mentioned a few times that they won’t be accepting investment in the future, since they don’t exactly have very high costs. They’re explicitly anti “VCware.” Features like Sync that depend on their server hosting bill being paid are only used by paying users, and most users will never have to use Obsidian servers past downloading and updating the app, and installing a few plugins of a few megabytes in size. Costs aren’t likely to rise in any substantial way, and their team is small enough to make it profitable to operate at their existing scale.
    • Actions like this are literally proactively recognizing that something wasn’t in line with their manifesto, and wasn’t beneficial for users, so they’re removing it. Companies planning to enshittify don’t usually remove enshittified/negative features they already have before re-enshittifying. They want you used to the enshittification from the start.
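    The “fire exit” point is easy to demonstrate: because notes are plain markdown files on disk, a few lines of stock Python can export an entire vault with no Obsidian tooling at all (the vault contents below are hypothetical):

```python
from pathlib import Path
import tempfile

def export_vault(vault_dir: str) -> dict[str, str]:
    """Read every markdown note in a vault as plain text --
    no Obsidian-specific tooling required."""
    return {
        str(p.relative_to(vault_dir)): p.read_text(encoding="utf-8")
        for p in Path(vault_dir).rglob("*.md")
    }

# Demo with a throwaway vault (hypothetical note paths):
with tempfile.TemporaryDirectory() as vault:
    Path(vault, "daily").mkdir()
    Path(vault, "daily", "2024-01-25.md").write_text(
        "# Log\nPlain markdown.", encoding="utf-8"
    )
    notes = export_vault(vault)
    print(notes)
```

    Any markdown-capable app can do the same walk, which is the whole migration story.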



  • True, but I’m of the belief that we’ll probably see a continuation of the existing trend of building on and improving existing models, rather than always starting entirely from scratch. For instance, you’ll almost always see a newly released model report the performance of its Llama-based variant, because combining the new technique with Llama’s existing quality just produces better results.

    I think we’ll see a similar trend now, just with R1 variants instead of Llama variants being the primary new type used. It’s just fundamentally inefficient to start over from scratch every time, so it makes sense that newer iterations would be built directly on previous ones.


  • So are these techniques so novel and breakthrough?

    The general concept, no. (it’s reinforcement learning, something that’s existed for ages)

    The actual implementation, yes. (training a model to think in a separate XML section, then reinforcing with the highest-quality results from previous iterations, using reinforcement learning that naturally pushes responses toward the highest-rewarded outputs) Most other companies just didn’t expect this to work as well as throwing more data at the problem.
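    As a toy sketch of that reinforcement idea (all data and scoring here are hypothetical, not DeepSeek’s actual pipeline): sample candidate responses that think in a separate XML section, score each final answer with a verifiable reward, and keep the highest-rewarded candidate to reinforce in the next round:

```python
import re

def split_thinking(response: str) -> tuple[str, str]:
    """Separate the <think>...</think> scratchpad from the final answer."""
    m = re.search(r"<think>(.*?)</think>", response, flags=re.DOTALL)
    thinking = m.group(1).strip() if m else ""
    answer = re.sub(r"<think>.*?</think>", "", response, flags=re.DOTALL).strip()
    return thinking, answer

def reward(answer: str, reference: str) -> float:
    """Toy verifiable reward: 1.0 for an exact match, else 0.0.
    (Real pipelines use checkers for math/code, not string equality.)"""
    return 1.0 if answer == reference else 0.0

# Candidate generations for one prompt (made-up model outputs):
candidates = [
    "<think>2+2 is 5? No, 4.</think>4",
    "<think>Guessing.</think>5",
]
reference = "4"

# Keep the highest-rewarded candidate to reinforce in the next iteration.
best = max(candidates, key=lambda c: reward(split_thinking(c)[1], reference))
print(split_thinking(best))  # ('2+2 is 5? No, 4.', '4')
```

    The key property is that only the answer is scored, so whatever thinking style earns high reward gets reinforced naturally rather than being trained in directly.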

    This is actually how people believe some of OpenAI’s newest models were developed. The difference is that OpenAI was under the impression that more data would be necessary for the improvements, and thus had to keep training the entire model on additional new information; they also assumed that directly training in thinking time was the best route, rather than letting it emerge through reinforcement learning. DeepSeek simply scrapped that part altogether and went solely with reinforcement learning.

    Will we now have a burst of deepseek like models everywhere?

    Probably, yes. Companies and researchers are already beginning to use this same methodology. Here’s a writeup about S1, a model that performs up to 27% better than OpenAI’s o1-preview on competition math questions. S1 used supervised fine-tuning, and did something so basic that people hadn’t previously thought to try it: just making the model think longer by modifying the terminating XML tags.

    This was released days after R1, based on R1’s initial premise, and creates better quality responses. Oh, and of course, it cost $6 to train.
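    S1’s trick (often called “budget forcing”) can be sketched in a few lines: whenever the model emits its end-of-thinking tag too early, strip the tag and append “Wait” so decoding continues. The `generate` stub below is hypothetical, standing in for a real model:

```python
def budget_force(generate, prompt: str, min_extensions: int = 1) -> str:
    """Toy version of S1-style budget forcing: each time the model emits
    the end-of-thinking tag before we've extended enough times, strip the
    tag, append 'Wait,', and let it keep reasoning."""
    text = prompt + "<think>"
    extensions = 0
    while True:
        chunk = generate(text)  # model continues from `text`
        text += chunk
        if text.endswith("</think>") and extensions < min_extensions:
            text = text[: -len("</think>")] + " Wait,"  # force more thinking
            extensions += 1
        elif text.endswith("</think>"):
            return text

# A stub "model" that always tries to stop thinking immediately:
def stub_generate(context: str) -> str:
    return " step.</think>"

out = budget_force(stub_generate, "Q: 2+2? ", min_extensions=1)
print(out)  # Q: 2+2? <think> step. Wait, step.</think>
```

    No retraining of the base model’s reasoning length is needed; the extra thinking is forced purely at decoding time, which is why it was so cheap to try.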

    So yes, I think it’s highly probable that we see a burst of new models, or at least improvements to existing ones. (Nobody has a very good reason to make a whole new model of a different name/type when they can simply improve the one they’re already using and have implemented)



  • ArchRecord@lemm.ee to Technology@lemmy.world · Bluesky now has 30 million users.
    18 days ago

    I don’t personally think it’s because of that. Sure, federation as a concept outside of email has a bit of a messaging problem when explaining it to newbies, but… everyone uses email and knows how that works. This is identical, just with posts instead of emails. Users aren’t averse to federation, in concept or in practice.

    Bluesky was directly created as a very close clone of Twitter’s UI, co-governed and subsequently pushed by the founder of Twitter himself, who will obviously have more reach than randoms promoting something like Mastodon, and, in my opinion, kind of just had better branding.

    “Bluesky” feels like a breath of fresh air, while “Mastodon” just sounds like… well, a Mastodon, whatever that makes the average person think of at first.

    So when you compare Bluesky (a familiar UI, a nice name, consistent branding, and algorithms, which Mastodon lacks, all funded by large sums of money) with Mastodon (unfamiliar branding, minimal funding, and substantially less reach from promoters), which one will win out, regardless of the technology involved?


  • ArchRecord@lemm.ee to Technology@lemmy.world · Bluesky now has 30 million users.
    18 days ago

    To anyone bemoaning BlueSky’s lack of federation, check out Free Our Feeds.

    It’s a campaign to create a public-interest foundation independent from the Bluesky team (although the Bluesky team has said they support it). The foundation will build independent infrastructure, like a secondary “relay” as an alternative to Bluesky’s that can still communicate over the same protocol (the “AT Protocol”), and will also fund developer grants for further social applications built on open protocols like the AT Protocol or ActivityPub.

    They have the support of an existing 501(c)(3), and their open letter has been signed by people you might find interesting, such as Jimmy Wales (founder of Wikipedia).