Replies: 1 comment
For building agents that call tools to solve problems!
Now, we have a strong desire to see InternLM, an outstanding LLM, achieve much wider application and adoption. It has demonstrated great potential and value among open-source LLMs, and we eagerly hope that more people will discover and join the InternLM community.

To achieve this goal, we make an important request: we earnestly petition the development team to consider adding an API interface. With API support, users would not only be able to use InternLM's features conveniently in local environments, but could also integrate it seamlessly into cloud services, web applications, and other software projects, greatly broadening its range of applications.

If an API interface is added, the community could build on it in several ways:

- Establish an InternLM Discord bot and community, allowing people from around the world to join, become active contributors to the open-source movement, and gain a better understanding of the InternLM community.
- Build an ecosystem of InternLM prompts, along with direct deployment and execution of various role-playing scenarios and agents, helping everyone grasp InternLM and its capabilities faster.
- Increase InternLM's influence. With easy-to-use API calls, users will have more ideas and stronger incentives to explore local deployment.
- Accelerate the development of the InternLM application ecosystem. API access would let beginners without high-performance machines quickly start working with InternLM, using tools such as Lagent to create their own services and applications, truly reaching millions of users and speeding up progress toward AGI in China.

In conclusion, we sincerely look forward to the prompt release of API support for InternLM to meet the needs of developers and users, to jointly build a better open-source environment and community in China, and to work together toward a future closer to AGI.

Feel free to share in the comments the features you would like to build with the API 💌
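To make the request concrete, here is a minimal sketch of what a chat-style API call for InternLM could look like. This is purely illustrative: no such endpoint exists yet, and the URL, model name, and payload shape below are assumptions (modeled on the common OpenAI-compatible convention), not a published InternLM API.

```python
import json

# Hypothetical placeholder endpoint -- an assumption, not a real InternLM URL.
API_URL = "https://api.example.com/v1/chat/completions"

def build_chat_request(prompt, model="internlm-chat-7b", temperature=0.7):
    """Build a JSON payload for a hypothetical chat-completion request.

    The field names follow the widely used OpenAI-compatible schema;
    the actual InternLM API, if released, may differ.
    """
    return {
        "model": model,
        "temperature": temperature,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("Introduce InternLM in one sentence.")
body = json.dumps(payload, ensure_ascii=False)
# Once such an endpoint exists, one would POST `body` to API_URL
# (e.g. with urllib.request or the requests library) and read the reply.
```

An interface in this style would let local scripts, web backends, and agent frameworks such as Lagent all talk to InternLM through the same small surface.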
25 votes