[{"data":1,"prerenderedAt":418},["ShallowReactive",2],{"release-release-61":3},{"id":4,"title":5,"author":6,"body":7,"category":405,"date":406,"dateFormatted":407,"description":408,"extension":409,"icon":410,"iconColor":410,"imagePath":411,"key":412,"meta":413,"navigation":39,"path":414,"seo":415,"stem":416,"__hash__":417},"releases/releases/release-61.md","Media analysis for your images","capacities",{"type":8,"value":9,"toc":389},"minimark",[10,14,22,25,40,45,52,59,65,72,79,85,88,94,101,107,118,121,127,133,136,142,147,150,153,157,160,163,169,172,176,179,189,195,203,209,217,223,225,229,239,246,251,253,257,260,262,266,277,280,306,314,316,320,337,341],[11,12,13],"p",{},"Remember the new image properties we added in the last release? Now the AI can fill them in for you. ✨",[11,15,16,17,21],{},"Media analysis is available in Beta for all ",[18,19,20],"strong",{},"Capacities Believers"," with AI enabled. It turns your screenshots, documents, and photos into information you can easily search, filter, and work with. 🖼️",[11,23,24],{},"Check out the video release notes!",[26,27,30,31],"div",{"className":28},[29],"video-block","\n    ",[32,33,30],"iframe",{"width":34,"height":35,"src":36,"frameBorder":37,"allow":38,"allowFullScreen":39},1280,720,"https://www.youtube.com/embed/NWP2B3VNsk0?si=8VjgefzhBvdsbLuM","0","accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture",true,[41,42,44],"h2",{"id":43},"analyze-any-image-with-one-click","Analyze any image with one click",[11,46,47,48,51],{},"Open any image in Capacities and click the ",[18,49,50],{},"Analyze"," button. 
The AI extracts a better title, a short description, detected text, a color palette, and a category.",[11,53,54],{},[55,56],"img",{"alt":57,"src":58},"Image before analysis with the Analyze button","/releases/release-61/before-analysis.png",[11,60,61],{},[55,62],{"alt":63,"src":64},"Analyzed image hero view","/releases/release-61/analyzed-image-hero.png",[11,66,67,68,71],{},"One of the most useful parts is ",[18,69,70],{},"text extraction (OCR)",". It works on screenshots, photos of whiteboards, and even handwritten notes. The AI reads the text and gives it back to you as copyable, searchable content.",[11,73,74,75,78],{},"All extracted text is indexed by ",[18,76,77],{},"full-text search",", so you can find images by the words that appear inside them.",[11,80,81],{},[55,82],{"alt":83,"src":84},"OCR results from a handwritten note","/releases/release-61/handwriting.jpeg",[11,86,87],{},"The extracted text, description, and color palette appear in the analysis tab, so the analysis remains available when you need it without getting in your way.",[11,89,90],{},[55,91],{"alt":92,"src":93},"Color palette extracted from an image","/releases/release-61/analysed-image.jpeg",[11,95,96,97,100],{},"The detected ",[18,98,99],{},"color palette"," is shown as visual swatches, and you can click any color to copy its hex value.",[11,102,103],{},[55,104],{"alt":105,"src":106},"Image analysis results with description, extracted text, and color palette","/releases/release-61/analyzed-image.jpeg",[11,108,109,110,113,114,117],{},"All of this data also flows into your image's properties automatically. 
The ",[18,111,112],{},"Category"," and ",[18,115,116],{},"Colors"," properties are populated by the AI, so you can filter and sort your images by what's actually in them.",[11,119,120],{},"You can then build queries like \"all handwritten notes\" or \"all blue images\" and embed them into moodboards or project pages.",[11,122,123],{},[55,124],{"alt":125,"src":126},"Query for all blue images","/releases/release-61/blue-images.jpeg",[11,128,129],{},[55,130],{"alt":131,"src":132},"Query examples using extracted image metadata","/releases/release-61/queries.png",[11,134,135],{},"The AI assistant can also use this extracted metadata as context when you chat about your media, without needing to re-read the original image file. It might find connections you did not realize you had. 💫",[11,137,138],{},[55,139],{"alt":140,"src":141},"AI chat using extracted image metadata as context","/releases/release-61/image-ai.jpeg",[143,144,146],"h3",{"id":145},"image-analysis-roadmap-️","Image analysis roadmap 🛣️",[11,148,149],{},"Media analysis is just getting started.",[11,151,152],{},"Next, we’re planning optional automatic analysis for new images (with a setting to turn it off), a rollout to all Pro users, and support for more media types beyond images. Stay tuned!",[41,154,156],{"id":155},"aliases-for-tags","Aliases for Tags",[11,158,159],{},"You can now add aliases to your tags.",[11,161,162],{},"This makes it easier to find tags when you use different terms for the same concept, or if you are multi-lingual and you want to refer to the same object in different languages.",[11,164,165],{},[55,166],{"alt":167,"src":168},"Tag aliases in tag settings","/releases/release-61/tag-alias.jpeg",[170,171],"hr",{},[41,173,175],{"id":174},"object-info-sidebar-deprecated","Object info sidebar deprecated",[11,177,178],{},"The object info sidebar has been deprecated. 
Its functionality has been redistributed:",[180,181,182],"ul",{},[183,184,185,188],"li",{},[18,186,187],{},"Text stats and created-at info"," have moved into the object menu:",[11,190,191],{},[55,192],{"alt":193,"src":194},"Text stats and created at info in the object menu","/releases/release-61/text-stats.png",[180,196,197],{},[183,198,199,202],{},[18,200,201],{},"Document outline"," is now a single-purpose floating element:",[11,204,205],{},[55,206],{"alt":207,"src":208},"Floating document outline","/releases/release-61/contents.png",[180,210,211],{},[183,212,213,216],{},[18,214,215],{},"Objects inside"," can now be found in a dedicated section in the new sidepanel:",[11,218,219],{},[55,220],{"alt":221,"src":222},"Objects inside in the sidepanel","/releases/release-61/objects-inside.png",[170,224],{},[41,226,228],{"id":227},"default-state-for-backlink-sections","Default state for backlink sections",[11,230,231,232,113,235,238],{},"You can now choose whether backlink sections start open or closed on object pages. This applies to ",[18,233,234],{},"Backlinks",[18,236,237],{},"Unlinked Mentions"," (Pro) on the web or desktop apps. You can still override the default per object.",[11,240,241,242,245],{},"There's also a new ",[18,243,244],{},"Backlink content view"," setting that lets you choose the default view for backlink content. Embed is the default when no option is selected.",[11,247,248],{},[55,249],{"alt":228,"src":250},"/releases/release-61/backlink-settings.png",[170,252],{},[41,254,256],{"id":255},"important-storing-issues-in-v1601","Important: Storing Issues in v1.60.1",[11,258,259],{},"There is a critical bug in v1.60.1 that in rare cases prevents some objects from being stored on your device. 
Please update to v1.60.20 on all devices as soon as the version is available.",[170,261],{},[41,263,265],{"id":264},"community-spotlight-capacities-quick-note-pro-for-android","Community spotlight: Capacities Quick Note Pro for Android",[11,267,268,269,276],{},"Shout-out to developer Daniel Peck who built ",[270,271,275],"a",{"href":272,"rel":273},"https://play.google.com/store/apps/details?id=com.dnnypck.capacitiesquicknotepro&hl=en_US",[274],"nofollow","Capacities Quick Note Pro",", a handy Android app that lets you jot something down on your phone without opening the full app.",[11,278,279],{},"Key features:",[180,281,282,288,294,300],{},[183,283,284,287],{},[18,285,286],{},"Instant capture",": type a note and append it directly to your daily note.",[183,289,290,293],{},[18,291,292],{},"Share from any app",": send text from other apps straight to Capacities.",[183,295,296,299],{},[18,297,298],{},"Homescreen widget",": capture without even opening the app.",[183,301,302,305],{},[18,303,304],{},"Multiple spaces",": switch between your Capacities spaces.",[11,307,308,309,313],{},"The app connects through the official Capacities API, so you'll need a Capacities Pro account to use it. 
It's available on the ",[270,310,312],{"href":272,"rel":311},[274],"Google Play Store"," for $1.99.",[170,315],{},[143,317,319],{"id":318},"improvements","✨ Improvements",[180,321,322,325,328,331,334],{},[183,323,324],{},"Better triggers for command palette and linking search.",[183,326,327],{},"Better ranking behavior in command palette and linking search.",[183,329,330],{},"Hide Backlinks in the main view if they are open in the side panel.",[183,332,333],{},"Unlinked mentions of an object are no longer limited to 100 elements.",[183,335,336],{},"Better display of the header if you only have a single tab.",[143,338,340],{"id":339},"fixes","🐛 Fixes",[180,342,343,346,355,362,365,372,379,382],{},[183,344,345],{},"Fixed an issue where starting an AI chat in the AI panel and then opening the full chat window would clear the prompt.",[183,347,348,349,354],{},"Corrected a misspelled word in the prompt asking users to verify their email. ",[270,350,353],{"href":351,"rel":352},"https://feedback.capacities.io/board/misspelled-word-in-please-verify-email-prompt",[274],"Ticket",".",[183,356,357,358,354],{},"Fixed alias linking so it no longer disables page embedding. ",[270,359,353],{"href":360,"rel":361},"https://feedback.capacities.io/board/alias-disables-page-embedding",[274],[183,363,364],{},"Fixed how tags are shown in embedded task views.",[183,366,367,368,354],{},"Fixed the \"Learn More\" link in the Sharing section, which directed users to an inaccessible page. ",[270,369,353],{"href":370,"rel":371},"https://feedback.capacities.io/board/the-learn-more-link-in-the-sharing-section-is-currently-broken-it-directs-users-to-an-inaccessible-url",[274],[183,373,374,375,354],{},"Fixed a bug where text disappears if a bullet is indented right below an image. 
",[270,376,353],{"href":377,"rel":378},"https://feedback.capacities.io/board/bug-text-disappears-if-bullet-is-indented",[274],[183,380,381],{},"Fixed object type colors in the create and link menu not reflecting settings.",[183,383,384,385,354],{},"Fixed sending a task to Things not adding a link back to the Capacities content. ",[270,386,353],{"href":387,"rel":388},"https://feedback.capacities.io/board/sending-task-to-things-doesnt-add-link-to-capacities-content",[274],{"title":390,"searchDepth":391,"depth":391,"links":392},"",2,[393,397,398,399,400,401],{"id":43,"depth":391,"text":44,"children":394},[395],{"id":145,"depth":396,"text":146},3,{"id":155,"depth":391,"text":156},{"id":174,"depth":391,"text":175},{"id":227,"depth":391,"text":228},{"id":255,"depth":391,"text":256},{"id":264,"depth":391,"text":265,"children":402},[403,404],{"id":318,"depth":396,"text":319},{"id":339,"depth":396,"text":340},"release","2026-04-01","April 2026","Capacities can now analyze your images with AI, turning them into searchable, richly described objects in your space. ✨","md",null,"/releases/release-61/cover-release-61.jpg","release-61",{},"/releases/release-61",{"title":5,"description":408},"releases/release-61","WmGydgeQVyrRyE7M7-JiAhLW6aI4dJXrKjykjeuxohg",1776250822366]