BRN Discussion Ongoing

I suspect the engineering revenue relates to companies incorporating AKIDA into their own chip designs. This could be the AKIDA 1000, 1500 or, as you mentioned, Gen 2.
I doubt it's for the almost plug-and-play, off-the-shelf AKIDA 1000 or 1500 chip.
So we may well be seeing the lead-up to some IP licences? Timing?
Not that it really matters to the context of your post, but AKD1000 and AKD1500 are both AKIDA 1.0 IP (Gen1).
 
  • Like
Reactions: 3 users

TopCat

Regular

IMG_0750.jpeg



IMG_0751.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 28 users
  • Like
  • Wow
Reactions: 12 users
Nope!

Frontgrade Gaisler Licenses Brainchip’s Akida IP to Deploy AI chips into Space

"Under this new commercial Akida IP licence agreement, BrainChip is entitled to receive a 10%
royalty on the Net Sale Price of Frontgrade’s first licensed product in exchange for providing Akida

1.0 IP that includes one hundred hours of integration support and twenty-four months of software
maintenance."
Fair point, how about the 3U VTX Snap Card from BHTech?

Surely we supplied support for this little beauty.

SC
 
  • Like
  • Thinking
Reactions: 6 users

TheDrooben

Pretty Pretty Pretty Pretty Good

Screenshot_20250731_093759_LinkedIn.jpg


Screenshot_20250731_093820_LinkedIn.jpg


Screenshot_20250731_094310_LinkedIn.jpg


Happy as Larry
 
  • Like
  • Fire
  • Love
Reactions: 37 users
The relationship between Megachips and BrainChip, particularly concerning engineering fees and the five-year license agreement, is governed by the terms of their initial partnership and subsequent amendments. As of July 31, 2025, Megachips continues to pay BrainChip for engineering fees related to the development and integration of BrainChip's Akida neuromorphic processor technology. These fees are typically structured to cover ongoing support, customization, and potentially new feature development as Megachips incorporates Akida into its various products and solutions.[1] The initial agreement, signed in 2020, established a framework for this collaboration, including licensing of the Akida IP and associated engineering services.[2]
According to www.iAsk.Ai - Ask AI:
Regarding the five-year license agreement, the initial five-year term for the Akida IP license between BrainChip and Megachips is set to expire in 2025. At the end of this period, several scenarios are possible, and the specific outcome will depend on the terms outlined in their original agreement and any subsequent negotiations. It is highly probable that Megachips will need to either renew the existing license agreement for another term (e.g., another five years) or negotiate a new agreement to continue using BrainChip's Akida technology.
 
  • Like
  • Fire
  • Love
Reactions: 22 users

Wags

Regular
Regarding the five-year license agreement, the initial five-year term for the Akida IP license between BrainChip and Megachips is set to expire in 2025. At the end of this period, several scenarios are possible, and the specific outcome will depend on the terms outlined in their original agreement and any subsequent negotiations. It is highly probable that Megachips will need to either renew the existing license agreement for another term (e.g., another five years) or negotiate a new agreement to continue using BrainChip's Akida technology.
Except it was a four-year agreement from Nov 2021.
Megachips License Agreement
 
Last edited:
  • Like
  • Thinking
Reactions: 9 users
So it's still valid until Nov 2025.
Which means Megachips must still pay engineering fees where applicable. Who knows what they have integrated BRN into and what they have not.
Time will tell.
 
Last edited:
  • Like
  • Fire
Reactions: 7 users

itsol4605

Regular
I have a hunch about what Apple is working on and who it's working with.
Apple is pursuing a targeted strategy.

 
  • Like
Reactions: 4 users
Itsol4605... can I ask what makes you think that from reading the white paper?
 

itsol4605

Regular
Itsol4605... can I ask what makes you think that from reading the white paper?
Yes, of course you can ask me.
But let's just wait and see—Apple won't sit idly by.
 
  • Like
Reactions: 3 users

Diogenese

Top 20
I have a hunch about what Apple is working on and who it's working with.
Apple is pursuing a targeted strategy.

When I saw reference to LRM (Large Reasoning Model), I immediately thought of Venn Diagrams, and, of course, someone had been there before me:

https://arxiv.org/abs/2406.05369v1

[Submitted on 8 Jun 2024]

Venn Diagram Prompting: Accelerating Comprehension with Scaffolding Effect

Sakshi Mahendru, Tejul Pandit

We introduce Venn Diagram (VD) Prompting, an innovative prompting technique which allows Large Language Models (LLMs) to combine and synthesize information across complex, diverse and long-context documents in knowledge-intensive question-answering tasks. Generating answers from multiple documents involves numerous steps to extract relevant and unique information and amalgamate it into a cohesive response. To improve the quality of the final answer, multiple LLM calls or pretrained models are used to perform different tasks such as summarization, reorganization and customization. The approach covered in the paper focuses on replacing the multi-step strategy via a single LLM call using VD prompting. Our proposed technique also aims to eliminate the inherent position bias in the LLMs, enhancing consistency in answers by removing sensitivity to the sequence of input information. It overcomes the challenge of inconsistency traditionally associated with varying input sequences. We also explore the practical applications of the VD prompt based on our examination of the prompt's outcomes. In the experiments performed on four public benchmark question-answering datasets, VD prompting continually matches or surpasses the performance of a meticulously crafted instruction prompt which adheres to optimal guidelines and practices.
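For anyone wondering what a single-call VD prompt might look like in practice, here is a minimal sketch; the prompt wording and the `llm_complete` client are my own placeholders, not the paper's actual prompt or code.

```python
# Minimal sketch of Venn Diagram (VD) prompting as described in the abstract:
# one LLM call that asks the model to separate overlapping vs. unique
# information across documents before answering. Prompt wording and the
# llm_complete callable are illustrative placeholders only.

def build_vd_prompt(question: str, documents: list[str]) -> str:
    doc_block = "\n\n".join(
        f"[Document {i + 1}]\n{doc}" for i, doc in enumerate(documents)
    )
    return (
        "You are answering a question using several documents.\n"
        "Step 1: Treat each document as one set in a Venn diagram and note the "
        "information shared across documents (the overlap) and the information "
        "unique to each document.\n"
        "Step 2: Synthesize the overlap and the unique pieces into one cohesive "
        "answer. Do not favour documents based on their order.\n\n"
        f"{doc_block}\n\nQuestion: {question}\nAnswer:"
    )

def answer_with_vd_prompt(question, documents, llm_complete):
    # A single call replaces the multi-step summarize/reorganize/customize pipeline.
    return llm_complete(build_vd_prompt(question, documents))
```

The point is that the overlap/unique framing and the order-insensitivity instruction go into one prompt, so a single call stands in for the multi-step strategy the abstract mentions.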

The authors are from Palo Alto Networks and are inventors in this patent application:

US2025126155A1 ORGANIZATION-SPECIFIC CYBERSECURITY DIALOGUE SYSTEM WITH LARGE LANGUAGE MODELS 20231016


1753938788911.png





An organization-specific cybersecurity dialogue system (“system”) intelligently generates responses to user utterances for cybersecurity by employing documentation and playbook agents for an associated cybersecurity organization. The system extracts an intent and an intent category from a user utterance, extracts relevant entities from the intent and uses them to determine relevant user telemetry data, and matches the intent, intent category, and user telemetry data to documents of the cybersecurity organization. The system then determines whether to generate a response based on matching document, matching playbook agents, or to prompt a user to create a ticket according to logic applied to the intent, intent category, user telemetry data, and matching documents and generates a corresponding response with large language models.

[0019] A response type evaluator 109 applies logic to determine whether to generate a response based on matching documents or based on playbook agents or a response that prompts the user 130 to create a ticket. If the overall ranking 110 indicates at least one document and the user telemetry data 106 does not indicate one or more security appliances associated with the user 130 or with a tenant for the user 130, the response type evaluator 109 communicates the high ranked (e.g., top-3) documents 126A to the response generator 111. The response generator 111 generates abstractive summaries of each of the high ranked documents 126A and generates one or more prompts to an LLM to generate a response to the user 130 via the user interface 100. For instance, the generated prompt can prompt the LLM to generate a response that includes the abstractive summaries, hyperlinks to where the high ranked documents 126A can be accessed, as well as indications of tone and verbosity for the response. The one or more generated prompts were previously engineered by the dialogue system 190 to ensure high quality responses by the LLM. An example response 122 comprises the following: Please read the article at url1 for a tutorial on firewall security policy configuration. The example response 122 can further comprise an abstractive summarization of the content of the article at url1.
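Reading [0019], the evaluator's branching might look roughly like the sketch below; all the names (the dataclass fields, the placeholder return strings, the top-k cutoff) are mine for illustration, not the patent's.

```python
# Rough sketch of the response-type logic described in [0019]: choose between a
# document-based answer, a playbook-agent answer, or a ticket prompt. Field and
# helper names are illustrative assumptions, not taken from the patent.
from dataclasses import dataclass, field

@dataclass
class RankedContext:
    matching_documents: list = field(default_factory=list)        # ranked, best first
    user_security_appliances: list = field(default_factory=list)  # from user telemetry
    matching_playbook_agents: list = field(default_factory=list)

def choose_response(ctx: RankedContext, top_k: int = 3) -> str:
    if ctx.matching_documents and not ctx.user_security_appliances:
        # Summarize the top-ranked documents and let an LLM compose the reply,
        # including links back to where those documents can be accessed.
        docs = ctx.matching_documents[:top_k]
        return f"LLM response summarizing {len(docs)} document(s) with hyperlinks"
    if ctx.matching_playbook_agents:
        return "LLM response built from playbook agent output"
    return "Prompt the user to create a support ticket"
```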

The reference to creating a response based on matching documents may have been the seed for the Venn development.
 
  • Like
  • Love
Reactions: 9 users

TopCat

Regular
This is an interview with the CEO of Quadric (the company on the same web page as BrainChip on the Megachips website; Douglas Fairbaine liked this post). Now I’m not qualified enough to understand this fully, BUT is it possible he’s talking about the AKD1500? There’s a short video of about 5 minutes on this LinkedIn post; could someone who knows this stuff take a look and say what they think?

 
  • Like
Reactions: 1 users

TopCat

Regular
Jinsung Choi… Principal Fellow at SoftBank, AI-RAN Alliance chair.



🚀The AI-RAN Architecture Revolution
The solution lies in rethinking the entire AI-RAN stack:
🧠 Distributed Intelligence Architecture:
Edge SLMs (1-7B parameters) for real-time RAN decisions
Cloud LLMs for complex network planning and policy creation
Knowledge distillation from teacher models to specialized RAN functions
⚡ Hardware-Software Co-Design:
Processing-in-Memory for reduced data movement
NPU/TPU/Neuromorphic processors for event-driven RAN analytics
Custom ARM/RISC-V extensions for telecom-specific operations
Advanced CPUs with integrated AI accelerators for hybrid workloads
Asynchronous pipelining to eliminate GPU idle time
📡 6G-Native Optimizations:
Prefix caching for repetitive network state queries
Heterogeneous computing matched to RAN workload diversity
Energy-aware scheduling tied to network traffic patterns

*my bold
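
The knowledge-distillation bullet is the piece that would feed those small edge SLMs. Purely as a rough illustration of the idea (the temperature, weighting and loss shape are my assumptions, nothing here comes from the LinkedIn post), a teacher-to-student distillation loss in PyTorch might look something like this:

```python
# Bare-bones knowledge-distillation loss: a small edge "student" model is trained
# to match the softened output distribution of a large cloud "teacher" model.
# Temperature and alpha weighting are illustrative choices only.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature: float = 2.0, alpha: float = 0.5):
    # Soft targets from the teacher, softened by the temperature.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    kd = F.kl_div(soft_student, soft_targets, reduction="batchmean") * temperature ** 2
    # Ordinary supervised loss on the hard labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce
```

The small student is what would then run the real-time RAN decisions at the edge, while the large teacher stays in the cloud.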

IMG_0755.jpeg



IMG_0754.jpeg
 
  • Like
  • Wow
  • Fire
Reactions: 12 users
Valeo, and not sure if posted before as it's nearly a year old

 
  • Fire
  • Like
Reactions: 2 users

TECH

Regular
Good evening,

Having read the company's 4C & Quarterly Report yesterday, I felt overall it was positive; the Board appears to have done a
thorough evaluation of its earlier statement about moving our entire listing offshore to the US, yes, that's correct, the USA.

And the decision/recommendation was a solid NO... in my opinion it was/is way too early, and the Chairperson is
part of a team that never, ever revolves around one individual and his fickle ideas; once again, that's my private view.

The revenue appeared to represent a 10-fold increase. Now, no shareholder should assume anything at this point; what we
would like to see is back-to-back quarters showing another increase, say 2 million+... slowly but continually increasing, quarter
on quarter, remembering that we are still starting from a very low base.

A concern to me was the failure to raise the 20 million AUD, receiving 8.2 million AUD at an average of 0.2059. Surely this
wasn't the agreement; I must be going mad. I thought the agreement was more like 0.50 a share x 40 million shares issued,
and I thought that Ken Scarince confirmed as much at the AGM?... so clearly, I am wrong.

Nevertheless, we have approximately 17.3 million USD to 1 August 2025... plus revenue (if any).

The grind continues.

Goodnight, All......Tech :sleep:
 
  • Like
  • Love
Reactions: 7 users

FuzM

Member
  • Like
Reactions: 3 users
FG is on the job!

 
  • Like
  • Fire
Reactions: 10 users