# Activity · gofireflyio/aiac

Repository: gofireflyio/aiac (public, org-owned, created 2022-12-07, default branch `main`).

- **2024-05-20** — PR merged into `main` by ido50 (Ido Perlmuter), 2 commits: "Add new GPT models to OpenAI backend" — the gpt-4-turbo, gpt-4-turbo-2024-04-09, gpt-4o and gpt-4o-2024-05-13 models are now supported by the OpenAI backend.
- **2024-05-15** — Branch `ido-gpt-4o` created by ido50 (same commit as above).
- **2024-03-20** — PR merged into `main` by ido50: "Added ModelMistral and made it default for OLLAMA (#90)" — adds the Mistral model to the Ollama backend and makes it the default, as it outperforms codellama.
- **2024-02-06** — Branch `ido-ollama` deleted by ido50.
- **2024-02-06** — PR merged into `main` by ido50: "Add support for the Ollama backend" — introduces support for the ollama.ai backend. Ollama is an open-source LLM backend meant for local usage, supporting many models, many of them related to code generation. The Ollama backend is selected via the `--backend` flag or the `AIAC_BACKEND` environment variable set to `"ollama"`. Ollama doesn't currently support authentication, so the only related flag is `--ollama-url` for the API server's URL; if not provided, the default URL (http://localhost:11434/api) is used. With this commit, `aiac` does not yet support a scenario where the API server is running behind an authenticating proxy. Resolves: #77.
- **2024-02-06** — Branch `ido-ollama` created and subsequently force-pushed twice by ido50 while preparing the commit above.
- **2024-02-06** — PR merged into `main` by ido50: "Bugfix: segfault when OpenAI API key not provided" — if an OpenAI API key was not provided and the OpenAI backend was used, a segmentation fault occurred; the code that verified the existence of the key had not been updated for recent changes in the project. Resolves: #84. Branch `ido-segfault` was created the same day with this commit.
- **2024-01-02** — PR merged into `main` by ido50, 2 commits: "Add ability to save and continue chatting" — previously, after showing results from the LLM provider, the aiac prompt line allowed saving only once, exiting immediately after saving. This commit lets users save multiple files during a conversation with the model. For backwards compatibility, the "s" key still saves and exits, while the new "w" key saves and continues chatting.
- **2023-12-27** — Branch `ido-save-and-chat` created by ido50 (same commit as above).
- **2023-12-26** — PR merged into `main` by ido50: "Update major version in goreleaser.yml"; branch `ido-goreleaser` was created earlier the same day.
- **2023-12-26** — PR merged into `main` by ido50: "Add support for Amazon Bedrock" — introduces support for using Amazon Bedrock as a backend for code generation. To achieve this, the client structure in libaiac is now based on an interface with two implementations: the previous OpenAI implementation moves to libaiac/openai, and a Bedrock implementation is introduced at libaiac/bedrock. The interface, described in the new libaiac/types package, is fairly simple, defining `ListModels`, `DefaultModel`, `Complete` and `Chat` methods that must be implemented. These changes are backwards compatible for both command-line and library usage, but the major version is bumped to 4 due to the significance of the change, meaning library users need to change import statements to use "v4" instead of "v3".
- **2023-12-06 to 2023-12-07** — Branch `ido-bedrock` created and force-pushed several times by ido50 while preparing the Bedrock commit above.
- **2023-12-07** — Branch `firefly-modulecall-64d0ef43bca0e00f0ce157bd` deleted by liavyona (Liav Yona).
- **2023-08-07** — Branch `firefly-modulecall-64d0ef43bca0e00f0ce157bd` created and pushed by radomarina (Rado Marina) with commit "Module Call", branched from "Add support for azure open ai api adaptations (#59)".
- **2023-07-18** — Branch `firefly-codification-64b69534e8beed0ea504b38f` deleted by davidlevi7 (David Levi).
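The Ollama-backend commit describes a simple precedence rule: an explicit `--ollama-url` value is used if given, otherwise the default `http://localhost:11434/api`. A minimal Go sketch of that fallback logic (the helper name `resolveOllamaURL` is hypothetical, not aiac's actual code):

```go
package main

import "fmt"

// defaultOllamaURL is the default API URL named in the commit message.
const defaultOllamaURL = "http://localhost:11434/api"

// resolveOllamaURL is a hypothetical helper illustrating the precedence
// the commit message describes: an explicit --ollama-url flag value wins;
// otherwise the default URL is used.
func resolveOllamaURL(flagValue string) string {
	if flagValue != "" {
		return flagValue
	}
	return defaultOllamaURL
}

func main() {
	fmt.Println(resolveOllamaURL(""))                          // falls back to the default
	fmt.Println(resolveOllamaURL("http://10.0.0.5:11434/api")) // explicit flag wins
}
```

Since Ollama has no authentication, there is no corresponding API-key flag, which is why the URL is the only Ollama-specific setting mentioned.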
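The Bedrock commit says the libaiac/types package defines an interface with `ListModels`, `DefaultModel`, `Complete` and `Chat` methods. A sketch of what such an interface might look like — the method signatures below are assumptions, since the commit message only names the methods:

```go
package main

import "fmt"

// Backend sketches the interface described in the libaiac/types commit
// message. Signatures are assumed for illustration only.
type Backend interface {
	ListModels() []string
	DefaultModel() string
	Complete(prompt string) (string, error)
	Chat(message string) (string, error)
}

// fakeBackend is a stub implementation showing how a new provider
// (OpenAI, Bedrock, Ollama, ...) would satisfy the interface.
type fakeBackend struct{}

func (fakeBackend) ListModels() []string  { return []string{"gpt-4o", "gpt-4-turbo"} }
func (fakeBackend) DefaultModel() string  { return "gpt-4o" }
func (fakeBackend) Complete(prompt string) (string, error) {
	return "generated code for: " + prompt, nil
}
func (fakeBackend) Chat(message string) (string, error) {
	return "reply to: " + message, nil
}

func main() {
	var b Backend = fakeBackend{}
	fmt.Println(b.DefaultModel())
	out, err := b.Complete("terraform for an s3 bucket")
	fmt.Println(out, err)
}
```

Because the module's major version is bumped to 4, Go's semantic import versioning means library consumers would import paths under `.../aiac/v4/...` rather than `.../aiac/v3/...`, matching the commit's note about changing import statements.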