OP Grant Application - Reducing gas costs on Optimism and Arbitrum

Introduction
Optimistic rollups like Optimism and Arbitrum are set to capture an increasing share of global DEX volume, as shown by the impressive growth of Arbitrum DEX volume in recent months.

They now represent over $2.5B per week, with ParaSwap accounting for only $40M over the last week, which is around 1.5%.

I believe our market share could be greatly improved by reducing the gas cost of interacting with ParaSwap on optimistic rollups. Indeed, when mainnet gas prices spike, as during the recent meme-coin season, L2 gas prices rise in step, and a transaction on Optimism through ParaSwap can now cost $5-10.

As the ecosystem grows, protocols will need to become more cost-efficient to remain competitive: a 50% gas saving could durably represent up to $5 of savings per transaction on Optimism. These savings could attract more users and daily traders, or incentivize more people to build their scripts on ParaSwap.

With these considerations in mind, I have been thinking for a few months about how to drastically reduce ParaSwap's costs on these optimistic rollups. It turns out this can be achieved through calldata compression. I have gathered many ideas, tested a few of them, and would like to seize the OP grant opportunity to accomplish this project for the DAO.

This compression algorithm will be universal: it will work for any calldata or swap method used in ParaSwap, guaranteeing a sustainable optimisation for all future versions of ParaSwap without the development burden of keeping up with, or accidentally breaking, the optimisation.

A gas saving of 50% or more on Optimism and Arbitrum would bring ParaSwap to the top of the aggregators on these chains, and could easily capture 1-2% of the relative volumes of Optimism and Arbitrum, multiplying the volume on these networks by 2-3 along with the related rewards.

This work will rely on byte-level compression algorithms, Solidity development, back-end development, and smart-contract optimisation to create a brand-new compression method tailored to optimistic rollups.

As the tech lead of an innovative blockchain startup since 2018, I am in a position to lead this work.

Project Name

Optimising gas costs on Optimism/Arbitrum

Grant Category

Community project, ParaSwap optimisation, cost savings

Key metrics

The price of a transaction on optimistic rollups mainly depends on the gas paid to post the transaction data to L1 (i.e., Ethereum).

A basic swap through Uniswap costs 5.4k gas of L1 data, while through ParaSwap it costs 10-11k gas.
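For readers less familiar with where this cost comes from: L1 data gas can be estimated directly from the bytes of the calldata. Here is a minimal sketch, assuming the raw EIP-2028 byte pricing that rollups inherit from Ethereum (illustrative only; the real OP/ARB fee formulas add a fixed overhead and a fee scalar on top):

```javascript
// Estimate the L1 data gas of a hex calldata string:
// 16 gas per non-zero byte, 4 gas per zero byte (EIP-2028).
function l1DataGas(hexCalldata) {
  const hex = hexCalldata.replace(/^0x/, "");
  let gas = 0;
  for (let i = 0; i < hex.length; i += 2) {
    gas += hex.slice(i, i + 2) === "00" ? 4 : 16;
  }
  return gas;
}

// Example: a 4-byte selector plus one zero-padded 32-byte argument.
// 4 non-zero + 31 zero + 1 non-zero bytes = 4*16 + 31*4 + 16 = 204 gas.
console.log(l1DataGas("0x54e3f31b" + "00".repeat(31) + "20")); // 204
```

Zero bytes being 4x cheaper than non-zero ones is exactly why ABI-encoded calldata, which is mostly zero padding, responds so well to compression.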

This difference could be reduced, or we could even achieve better gas costs, giving ParaSwap the lowest costs across aggregators on optimistic rollups.

Reducing the current gas cost of interacting with ParaSwap on Optimism will be a clear advantage, as users will see gas costs reduced by at least 35-50%.

The exact volume acquired is to be determined, but there is a lot of room to grow:

  • current trading volume on Optimism (7D): $380M (according to DefiLlama)

  • current trading volume on Paraswap (7D): $10M

These volumes are roughly 10x higher on Arbitrum.

By reducing transaction costs, I think a 2x increase in the volume going through ParaSwap is a realistic, even pessimistic, expectation, as it would still represent only ~2% of the total DEX volume on Optimism. In this scenario it would mean at least $560M of additional volume per year; with a conservative DAO reward on positive slippage of 0.0035% of volume, that could easily generate more than $20k a year.

The equivalent figure for Arbitrum would be around $200k/year in fees for the DAO.

In total, this upgrade could bring over $200k/year in fees to the DAO. As Arbitrum is currently often the second-highest volume chain for the ParaSwap protocol, this could represent a noticeable increase in rewards and volume for ParaSwap.

If this upgrade were already implemented, it would have saved ParaSwap users $116k in gas fees (assuming 300k transactions on Arbitrum and 140k on Optimism, with average savings of 20 and 40 cents respectively).
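A quick sanity check of the two dollar figures above (all inputs are the proposal's own estimates, not measured data):

```javascript
// Fee revenue estimate: $560M of extra yearly volume at a 0.0035%
// positive-slippage reward for the DAO.
const extraVolumePerYear = 560e6;
// 0.0035% = 35 per million; integer arithmetic keeps the result exact.
const feeRevenue = extraVolumePerYear * 35 / 1e6;
console.log(feeRevenue); // 19600, i.e. roughly $20k/year

// Saved gas fees: 300k Arbitrum txs at ~$0.20 saved each,
// 140k Optimism txs at ~$0.40 saved each (computed in cents).
const savedFees = (300_000 * 20 + 140_000 * 40) / 100;
console.log(savedFees); // 116000
```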

In total, ParaSwap users will benefit from cost savings on Arbitrum and Optimism, and $PSP stakers from higher ETH rewards.

Optimism Alignment

This project will help to grow volume on Optimism and Arbitrum by decreasing the gas costs for every OP user interacting through Paraswap. It will also help to decrease the overall gas consumption of optimistic rollups on mainnet, and could help decrease the costs of interacting with both L1 and L2.

ParaSwap Alignment

This project will help grow ParaSwap's volume on Optimism and Arbitrum by increasing ParaSwap's competitive advantage over other DEXs and aggregators.

This project will be structured in 3 phases in collaboration with the Paraswap dev team:

  • research & PoC: ~1 month, including meetings to move into closer collaboration phases
  • evaluations and optimisations: 1-2 months
  • adaptations of the code base and deployment: 1-2 months

Total duration: 3-5 months, with delivery to production expected by October 2023 at the latest.

I will lead all phases and make whatever changes I can myself; for the deployment phase, I will advise and accompany the ParaSwap dev team as much as needed.

Amount of OP requested

Initial 2k $OP on validation of the grant.

  • research & POC: 5k $OP
  • evaluations and optimisations: 7k $OP + 5k $OP bonus (if average saving > 45%)
  • adaptations of code base and deployment: 6k $OP

Total: 20k $OP + 5k $OP performance bonus (around $30k at the current ~$1.4 OP price).

As this project could bring a real advantage to ParaSwap, and given the complexity of the implementation and the innovative nature of this research for aggregators and DEXs, 20-25k $OP seems a fair amount to request if all the success metrics are met. A total duration of 4-5 months will be required to release this upgrade, though I have been studying its feasibility for a while. This will also cover the costs of deployment and testing, as deploying contracts on Optimism and Arbitrum can be quite expensive in the long run.

I hesitated for a while about moving part of the requested $OP into a bonus tied to each network's evolution in volume, number of transactions or daily active addresses, as it would seem fairer for the DAO to pay only for results. But given the work needed to set up those comparisons, and the uncertainty that market conditions will match those of the past months, I finally decided not to include this.

Success Metrics

Success is determined for each phase as follows:

Research & POC:

  • A document gathering ideas for optimising calldata size on Optimism (which could stay confidential if necessary to preserve the competitive advantage)

  • Implementation of a smart contract performing calldata decompression

  • Implementation of a JavaScript script that compresses calldata, to be given as input to the smart contract's decompression method

  • An average compression rate of at least 25% is required at this stage

Evaluations and optimisations

  • Create a framework for evaluating the performance and correctness of the compression algorithms across a variety of inputs, together with the related gas costs

  • Optimise the contract to reduce the callData even more

  • Optimisations on the compressing algorithms

  • An average compression rate of at least 35% is expected at this stage, and the algorithm should be efficient (running in a few ms)

  • If the average compression rate reaches at least 45%, the 5k $OP bonus is unlocked

Adaptations of code base and deployment

  • Testing in real conditions and final adaptations of the contract for production

  • Help and guidance around the necessary modifications to the API and on-chain components

  • Well-designed and proven internal documentation and flows for the compression

  • Implementation of the compressing algorithm for the Paraswap API specific language

  • Support and slight post-production optimisations where possible
  • Gathering data and analytics on the savings
  • Documentation and write-ups communicating the advantages and savings achieved through this optimisation

When all the points of a phase are considered reached, the related grant tranche can be unlocked.

I hope this grant proposal will earn your trust and agreement; as I was also recently accepted as a ParaTrooper, you can get a bit more of my background here.

I'm determined to achieve my ideas for the DAO and to help develop and structure an innovative protocol.

Cheers

4 Likes

As I said on Discord, I find this proposal very interesting and would like to see it develop.
I'm eagerly waiting for other people's feedback, but I think this protocol update goes in the right direction and will allow ParaSwap to strengthen itself and become even more attractive.
Nice work :wink:

1 Like

Hi Chab, before going into detail

I have the following questions:

Why didn’t you tell the core team about your work and get recruited by them?

Why Optimism and not ETH? Why is the proposal for L2s when our current core market is Ethereum mainnet?
It makes sense if you consider that the L2s you mention will be the stars of the next bull run, but at the moment it's ETH that's leading.

Thank you in advance for your answers.

Hi chab!
Your proposal/application touches on a very important subject, that of ParaSwap’s competitiveness.
Our main objective is to increase the volume transiting through ParaSwap, and competitiveness is the number-one asset for reaching this goal, whether via rates or gas costs; as we can see, the latter are also taken into account by the “meta-aggregators”.
It seems to me that Ethereum is already working on EIP-4844 to improve the gas costs of rollups during validation on L1, but in a more global way.

Your subject is very technical and I think that the intervention of other more knowledgeable members will be necessary to form an educated opinion.

However, a few small non-technical questions:

  • Is this problem of extra tx costs specific to aggregators? Does it affect them all without exception? Have other aggregators already implemented similar solutions?
  • I know it's a lot to ask, but would you have a simple way of presenting the changes you want to bring to the non-technical DAO audience?

Concerning the more technical part and the impacts on ParaSwap if this grant is voted:

  • We’d need to identify individuals capable of verifying that the various objectives and success metrics have been met before the grant is released.
  • All the code and its impact on ParaSwap would have to be checked beforehand to validate its integration.

A lot depends on your relationship with the DAO members who currently maintain the protocol. Their feedback is crucial:

  • Analyze the validity of the objectives put forward
  • Check modifications to the protocol
  • Follow-up on objectives

In any case, thank you for your message, and more generally for your involvement on the discord.

3 Likes

Hi stikers,

This project is a one-time job: I can bring it to the protocol, but I'm not aware of a need for a permanent position, and I'm convinced a permanent one isn't necessary for this specific optimisation. Moreover, the core team's stance, with which I agree, is that the DAO must validate every decision to finance any evolution. Additionally, there is a grant for Optimism-targeted developments, which @Burns and @0xYtocin made me aware of, and it seems the perfect funding solution for this development.

I understand your point, but sadly this optimisation is, for now, specific to L2s that are optimistic rollups. As I tried to explain above, on optimistic rollups the main driver of transaction cost is the amount of data sent in the call to the contract. On ARB/OP this data can be shrunk enough that the additional computation cost of decompressing it is lower than the savings from the compressed calldata.
On mainnet, sadly, the computation cost would far outweigh the savings on calldata size.
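To make the trade-off concrete, here is a back-of-the-envelope sketch; every number in it is a hypothetical round figure chosen for illustration, not a measurement:

```javascript
// On an optimistic rollup, calldata is billed at (roughly) L1 gas
// prices, while the extra decompression work runs at much cheaper
// L2 execution prices. The prices below are illustrative assumptions.
const l1GasPriceGwei = 30;   // assumed mainnet gas price
const l2GasPriceGwei = 0.1;  // assumed L2 execution gas price

const dataGasSaved = 5000;       // L1 data gas removed by compression
const decompressionGas = 20000;  // extra L2 execution gas to decompress

const saving = dataGasSaved * l1GasPriceGwei;   // 150000 gwei saved
const cost = decompressionGas * l2GasPriceGwei; // ~2000 gwei spent
console.log(saving > cost); // true: profitable on an L2

// On mainnet both sides are billed at the same gas price, so burning
// 20k gas of computation to save 5k gas of calldata is a net loss.
console.log(dataGasSaved > decompressionGas); // false
```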

I hope I made it clearer!

Cheers

4 Likes

Hi @Albist,

Indeed, I share your view on the importance of competitiveness for ParaSwap; in this wild world, that's what will differentiate projects in the long term.
EIP-4844 is also really great news for L2 adoption in the long term, but it isn't planned before the end of 2023 and will realistically be released in early 2024 at best (I could be wrong, of course). Moreover, it will only reduce L2 costs substantially if usage on them is high, so the 100x reduction in L2 gas costs won't arrive tomorrow in my opinion; I'd guess a 5-10x reduction factor is more realistic at the beginning.
In any case, this optimisation will still make ParaSwap more competitive than other aggregators on OP/ARB in terms of gas costs, and will probably attract more people to L2s who are still keen to save as much gas as possible. So it's pretty good timing, I would say, and a very good thing for L2s.

Is this problem of extra tx costs specific to aggregators? Does it affect them all without exception? Have other aggregators already implemented similar solutions?

Not really; DEXs can have non-optimised gas costs too, depending on how they design the data sent to the contract. Uniswap, for example, is a bit more optimised on L2s because it mainly needs pool IDs and an amount, whereas a call to ParaSwap specifies many more parameters, since it can reach many more exchanges with various specific parameters; that's also how most aggregators work from the contract-call point of view. Currently I'm not aware of any optimisation of this kind among aggregators; I think it's more of an MEV secret.

I know it's a lot to ask, but would you have a simple way of presenting the changes you want to bring to the non-technical DAO audience?

Sure, I'll try to simplify. The bigger a transaction is in data, the more it costs on OP/ARB; it's basically the #1 factor in the price of the transaction. I propose to greatly reduce the size of the transaction by compressing the data, decompressing it in the contract, and then handing it over to the normal ParaSwap flow.
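As a toy illustration of that idea (a hypothetical scheme for this post, not the actual algorithm being proposed): ABI-encoded calldata is dominated by zero-byte padding, so even a naive run-length encoding of zero runs already shrinks it a lot.

```javascript
// Toy compression: a 0x00 byte acts as an escape, followed by the
// length (1-255) of the run of zero bytes it replaces.
function compress(bytes) {
  const out = [];
  for (let i = 0; i < bytes.length; ) {
    if (bytes[i] === 0) {
      let run = 0;
      while (i < bytes.length && bytes[i] === 0 && run < 255) { run++; i++; }
      out.push(0, run); // escape + run length
    } else {
      out.push(bytes[i++]); // non-zero bytes pass through unchanged
    }
  }
  return out;
}

// The on-chain contract would perform the inverse before forwarding
// the expanded calldata to the normal swap flow.
function decompress(bytes) {
  const out = [];
  for (let i = 0; i < bytes.length; i++) {
    if (bytes[i] === 0) {
      out.push(...new Array(bytes[++i]).fill(0)); // expand the zero run
    } else {
      out.push(bytes[i]);
    }
  }
  return out;
}

// A zero-padded 32-byte amount word collapses from 32 bytes to 4.
const word = new Array(30).fill(0).concat([0x5a, 0xf3]);
console.log(compress(word).length); // 4
console.log(decompress(compress(word)).length); // 32
```

A production scheme would go further (dictionaries of common addresses, packed offsets, etc.), but the principle is the same: pay a little L2 computation to send far fewer L1 bytes.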

We’d need to identify individuals capable of verifying that the various objectives and success metrics have been met before the grant is released.

I agree some people need to be appointed as "judges" to validate the steps: mostly the core team, I imagine, given their expertise. I will have to work with them anyway to validate any eventual integration.

All the code and its impact on ParaSwap would have to be checked beforehand to validate its integration.

Sure, this will be tested across every case. More broadly, this optimisation is not applicable only to ParaSwap calls, though the compression algorithm will be optimised for this kind of data structure. It will therefore be easy to write many tests for the algorithm, as we can check compression/decompression validity for any hexadecimal string. The final deployment step will be a decompressing proxy in front of the current ParaSwap contract, and the proxy part is not a big challenge. Moreover, this can be reverted very easily, and the previous API could remain the default at first unless an "optimisationForL2" parameter is set to true. I've done a lot of updates/upgrades/migrations as a tech lead, and I'm really not worried about this one.
The core team will also make sure the implementation and testing framework are solid, as I'll be in close contact with the technical team quite soon to be sure I'm not missing anything in the protocol or in my implementation.
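The kind of round-trip testing described above is easy to express as a property test. A minimal sketch, where `compress`/`decompress` are identity placeholders standing in for whatever scheme is eventually implemented:

```javascript
const compress = (hex) => hex;   // placeholder for the real compressor
const decompress = (hex) => hex; // placeholder for the on-chain inverse

// Generate random calldata-like hex strings, biased toward zero bytes
// the way real ABI-encoded calldata is.
function randomHex(byteLength) {
  let s = "0x";
  for (let i = 0; i < byteLength; i++) {
    s += Math.random() < 0.5
      ? "00"
      : Math.floor(Math.random() * 255 + 1).toString(16).padStart(2, "0");
  }
  return s;
}

// Property: decompress(compress(x)) === x for any hex string.
let ok = true;
for (let i = 0; i < 1000; i++) {
  const input = randomHex(Math.floor(Math.random() * 512));
  if (decompress(compress(input)) !== input) ok = false;
}
console.log(ok); // true
```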

A lot depends on your relationship with the DAO members who currently maintain the protocol. Their feedback is crucial:

I agree they will carry quite a responsibility in assessing my work at the beginning, but the community will at least witness the final release of the optimisations, and I plan to update you regularly on every big step. Moreover, I already have positive feedback on this idea from them.

Thanks for your interest and support, hope I made it clearer.

Cheers

5 Likes

There have been some discussions with the core team about viability and, as mentioned above by @Albist and @Bach, this will need a technical sponsor to evaluate each deliverable prior to the release of milestone funds. I think the value proposition and PSP/OP alignment is clear, so the remaining steps before moving to a vote are:

  1. Identify a technical sponsor for each dot point/minor deliverable under the success metrics, to be agreed by said technical sponsor
  2. Agreement by the implementation team on the viability of the proposal and timeline, and that the code will be merged

Without these confirmed, this work (while highly valuable) will provide no benefit to ParaSwap or Optimism.

It is worth carefully reviewing each minor deliverable under the success metrics to ensure that all comprising an individual milestone can be completed in a similar timeframe to avoid delaying paying out that milestone.

Without a grants committee managing this, it will be up to the grantee to coordinate communication by providing updates and giving adequate warning to the reviewer(s) so they can schedule this work in around core duties. Through the process of identifying a technical sponsor, a communication process and required advanced warning time will need to be agreed on. For example - are all minor deliverables within the Milestone to be submitted at once for review to kick off or are some reliant on others within each Milestone?

Great initiative, super detailed proposal and very interesting idea.

Nevertheless, to evaluate the financial part of this grant we need more information to calculate an hourly rate, so we can compare it to industry standards.

As a full-stack developer myself, I find the total duration very long. Do you plan on working alone on this?
If not, can you detail how many people will be dedicated to this grant, for how many hours a day and over how many days?

Is it going to be a full time job for you?
If not how many hours/day and how many days/week are you going to dedicate to the grant?

Also core dev team members (ping @Lup) could also help us here to approximately evaluate how many hours this kind of work could take.

4 Likes

@enerow It can appear long, but for this level of optimisation it really isn't. Some teams spend months on very small optimisations, so reaching the "most" optimised version is genuinely difficult development work. Many iterations will be needed, each with its own compression scripts and decompression contract. The test framework will need to be created, and every iteration will be carefully documented and tested. Frequent meetings with the team will be required, as well as step-by-step measurements for deployment, etc.
So four months isn't that much. Moreover, I've been thinking about this for almost six months: I came up with the first ideas a few months ago, and between theorising and some testing I must already have worked around 80 hours on it.

But to answer more precisely your points:

  • I'll be working alone on this; I can handle front-end, back-end and smart contracts, and I have knowledge in security.
  • I'm going to work 20 hours a week on this project for as long as needed: 12 hours spread over the week and 8 hours per weekend. The complete project should take around 350-400 hours to reach production.
  • Regarding industry standards, I've seen many rates around $12k/month for freelance full-stack devs competent in smart contracts. Counted in 35-hour weeks, the total duration is around 2.5 months; without the bonus the request is around $26k, i.e. ~$10k/month. A bonus is also pretty common when there are clear beneficial outcomes in very specific areas, as we could reach here. There's also some risk for me if it takes much longer, so the bonus helps mitigate that. Splitting the reward also protects the DAO in case I can't finish the job, though I'm not planning on that.

I agree another opinion would be valuable for the DAO to ensure the solidity of this proposal; I think the DAO has never been approached this way for such an optimisation.
From a personal point of view, I'm not really willing to sell this development for much cheaper than this, because of the taxes resulting from this job.
I hope the DAO can understand that this is a very specific development from scratch, and that it's going to require significant work and headaches!

@Burns I agree the team will probably help define a process soon, which will need to be incorporated into the final proposal. At minimum, a lead reviewer and a backup reviewer will be needed to review each main technical step and its results.

Cheers

4 Likes

Thank you very much for that very detailed answer once again.
All clear to me for now.

2 Likes

Thanks for the proposal; it's a great idea and great research.
Some questions that come to mind:

  • In the budget part you didn't take into account the cost of auditing the final contract update (by Code4rena or Sherlock, for instance). I know it's not part of your grant, but I think it would be good to have the total budget for the DAO.
  • Will the gas saving still be worth it once EIP-4844 ships? I mean, if you expect EIP-4844 live next year and your research/work takes 3-5 months, we could already be in 2024 when it goes live.
1 Like

Thanks @Xutyr

  • Indeed, it can be interesting to foresee this cost. However, it isn't strictly needed yet because, as I said, the new contract will be a proxy, so there is no additional risk to the original protocol, which stays unaltered. Some risk might exist in "optimisation noise" that could reduce the algorithm's efficiency, but I think I have ways to cover it, and with my pentest experience I'll be able to find most issues. One problem with Code4rena is that I don't know whether we'll release the smart-contract code from the start: I know it's better for confidence, but it could also be copied by other exchanges, so I may need to find the right licensing here, if anyone has ideas about that. To answer your point, this shouldn't cost more than $2-3k because the contract's attack surface should be very small.
  • Definitely, it will remain an advantage. If, for example, ETH one day reaches $10k and mainnet runs at 100 gwei, that would still mean $1+ transactions on Optimism even with EIP-4844. So it's very market-dependent, I agree, but still an advantage for ParaSwap users, and it also reduces the data stored, limiting the growth in size of both ledgers (L1 and L2).

Cheers

4 Likes

This is a very important and audacious proposal. If this works as proposed, it will be of immense benefit to the protocol, the rollups and the DAO.

As a non-technical person, I understand the idea but not the intricacies. My major concerns are:

  1. The success of this development relies entirely on you, which means that even if we get everything else right, we'll still need to count on you to deliver.

  2. As this proposal is about gaining a competitive advantage, timing is key. Could you look into getting a collaborator for this work to drastically reduce the development time? 3-5 months is a very long time in this industry.

  3. If a success metric of a 45% cost reduction warrants a bonus, what about an inability to deliver or to complete the development? Will that also attract a penalty?

Hey @dseeker, thanks for your feedback!

  1. Indeed, I have an important role in this development; however, the grant is divided into several parts, so if, unfortunately, I can't finalise the development, the team will still have a fair amount of work to start from. Thanks to this division, the DAO will only have paid for a fraction, and the work can be resumed by someone else from the foundations I will have laid. I'd highlight that, except for a very small compensation at the beginning, all payments are due after each step. If you think it's really necessary to name someone in the proposal, I can add another user: a close friend I share this work with, who would be able to guide the development forward.
  2. Timing matters a lot, I agree. I'm making good progress, but there are incompressible delays; for example, going to production is a prioritisation question to settle with the core team. Things move fast globally, but several aggregators have been on these chains for a while and none has implemented this yet, so we're probably not late but rather ahead of them.
  3. As mentioned in 1., each step is tied to a specific payment, so leaving a step unfinished is in essence a penalty. The bonus, however, is tied only to the cost-reduction ratio: as mentioned in the proposal, volume is very hard to monitor and evaluate because of market conditions, the evolution of actors, and so on.

Cheers !

3 Likes

Hi Chab, thanks for submitting this proposal.

Making the protocol more efficient is one of our key priorities, and it's great to see the DAO aligned on this front too: part of the ParaSwap dev team's time is assigned to researching and developing these ideas.
As mentioned in the discussion, this proposal differs from previous grant requests as it would require close cooperation with the core team to be implemented instead of being an independent venture and might overlap with other ideas currently being developed.
To maximize efficiency, we believe this proposal requires further alignment with what has already been researched and developed. @chab would be great to discuss this further. Please feel free to reply to either of our messages through Telegram or Discord so we can make sure to tackle this as efficiently as possible.

Hi, I’ve been in contact through Telegram and Discord for weeks now.

However, no "close cooperation" is needed: I'll simply need the team at certain points in this proposal to validate the work, which will ensure the final technical requirements are met, and I'll mostly only need help deploying to production. Nothing more than that.
Moreover, as I stated, this optimisation isn't really protocol-dependent: it will be optimised for this protocol, but further optimisations on your side can easily go on top of it.
It's not directly a protocol upgrade, since I'm not touching the protocol itself, simply a proxy system to reduce gas, optimised for the structure of ParaSwap calls.

When I first approached the team months ago, they told me no chain-specific optimisation was underway, so I'm surprised this would overlap with pending optimisations.
Still, I understand the importance of synchronising development; we have a meeting planned with the team on Monday, so we should settle all of this there.

Cheers

3 Likes

Hello Chab,

As discussed together, I am reporting here some elements of our conversation and my feedback.

In my opinion, it would be more advantageous to integrate it as a delegated implementation of Augustus (like simpleSwap, multiSwap, …). This approach would let us leverage the pre-existing infrastructure: security checks, the ubiquitous ERC-20 approvals, protocol legitimacy, a larger user network, etc. This does imply closer cooperation with the team at key stages at least (onboarding, testing, …).

I confirm we had no plans for calldata compression as a sole purpose, though the team is constantly working on improving contract design across all chains to reduce gas overhead, streamline the DEX integration process, and more.
This has led us to work on a new architecture which will, as a result, reduce the size of Augustus' calldata, although this is not the main goal (one can see it as a positive side effect).

However, I agree that a somewhat generic compression would still reduce calldata size by even more, although it would be marginally less performant under better conditions. Those better conditions are not here yet, and I too believe the DAO would benefit from any improvement, whatever the order.

Finally, in my view, two aspects warrant clarification in order to better understand timelines and costs:
1/ empirical evidence demonstrating how your solution will outperform generic compression/decompression implementations (e.g., Solady’s LibZip)
2/ the payment conditions and the various milestones. Given that we are already in early August, is the timeline still accurate?

Once we have this technical clarity, I believe we are ready to move forward, and I really look forward to this project!

2 Likes

Huge thanks for the clarification!
Sorry, I was on holiday and then had a lot of work to deal with; I finally have time to write this answer.

Definitely, that is the right implementation to go with; even without privileged access I'll be able to test it, and it will not open new vulnerabilities. I would say, though, that closer cooperation does become a bit more important in the end.

Glad we agree on this.

1/ I did a few tests to confirm this generates more optimised calldata, and with my current implementation (still a WIP) there is no doubt. Take this calldata:

0x54e3f31b000000000000000000000000000000000000000000000000000000000000002000000000000000000000000082af49447d8a07e3bd95bd0d56f35241523fbab1000000000000000000000000f97f4df75117a78c1a5a0dbb814af92458539fb400000000000000000000000000000000000000000000000000005af3107a4000000000000000000000000000000000000000000000000000006cb51ac3f3fb27000000000000000000000000000000000000000000000000006cecdfd2328bd100000000000000000000000000000000000000000000000000000000000001e00000000000000000000000000000000000000000000000000000000000000220000000000000000000000000000000000000000000000000000000000000034000000000000000000000000000000000000000000000000000000000000003a00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000e4d12a78a24f63f856b7192beaacc9875d387fec010000000000000000000000000000000000000000000000000000000000400000000000000000000000000000000000000000000000000000000000000003e0000000000000000000000000000000000000000000000000000000006480f7a77da2bac8aa204c7f9eb70c5540b31a2e000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001000000000000000000000000b41dd984730daf82f5c41489e21ac79d5e3b61bc00000000000000000000000000000000000000000000000000000000000000e491a32b6900000000000000000000000082af49447d8a07e3bd95bd0d56f35241523fbab100000000000000000000000000000000000000000000000000005af3107a40000000000000000000000000000000000000000000000000000000000000000001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000a00000000000000000000000000000000000000000000000000000000000000001000000000000000000004de47050a8908e2a60899d8788015148241f0993a3fd000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000002000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000e4000000000000000000000000000000000000000000000000000000000000000100000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000

The library you quoted has two compression functions: cdCompress and flzCompress. The initial cost of the calldata was 6596 gas, and these two methods reduce it to 4720 and 4608 respectively.

My ambition is to reduce the callData cost by more than 45%, i.e. below 3627 gas, making it at least 31% more efficient than these available libraries. And I think we could reach even lower, around 2800-3000!
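For context, these gas figures follow directly from the standard EVM callData pricing (EIP-2028): 4 gas per zero byte and 16 gas per non-zero byte, which is why zero-heavy ABI encoding compresses so well. A minimal sketch of the computation (the function name is mine, for illustration):

```javascript
// Computes the calldata gas of a hex string under standard EVM pricing:
// 4 gas per zero byte, 16 gas per non-zero byte (EIP-2028).
function calldataGas(hex) {
  const bytes = Buffer.from(hex.replace(/^0x/, ""), "hex");
  let gas = 0;
  for (const b of bytes) gas += b === 0 ? 4 : 16;
  return gas;
}

// Example: 2 zero bytes + 2 non-zero bytes = 2*4 + 2*16 = 40 gas.
console.log(calldataGas("0x0000ffff")); // 40
```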

The timeline has drifted a bit, for sure: I didn't continue developing the algorithm during the summer, as I needed more confirmation from the DAO that my work would be useful and that I'd be paid! However, I already have a good working MVP that I need to structure better, so we could consider the first milestone already achieved.
The likely milestone for the completed development is still 2-3 months from the proposal validation, and deployment could be established within 3-4 months now that the team has agreed to this idea.
Regarding the payment conditions, I based them mostly on the achieved average compression rate, because it's an easy metric to compute.

I’m happy to have this confirmation from the team, and the discussions we had have made me confident that this project will be very interesting for the DAO and, in the end, quite straightforward to put in place.

So if it’s good for everybody here, I think we can look forward to making it a proposal!

Cheers


After a short silence due to confirmations with the team, long holidays and some hard weeks afterwards, I’m back to dedicate my time to this development!

I’m willing to put this grant application to a vote on Snapshot soon, so I’m doing a last check here before sending it. I tried to take all your remarks into account and made the text a bit shorter for better readability during the vote, for those who didn’t follow the debates here and are less technical.
If the DAO still mostly agrees, this will go to a Snapshot vote in the coming week (around the 19th of September):

OP grant application - Reducing gas costs on Arbitrum and Optimism

The complete forum discussion (with more technical details) can be found here: https://gov.paraswap.network/t/op-grant-application-reducing-gas-costs-on-optimism-and-arbitrum/1488


Abstract
Optimistic rollups are gaining traction and trading volume. Despite being far cheaper (~10x) than Ethereum for executing trades with Paraswap, they still depend on gas conditions on Ethereum, so their gas price can increase a lot when Ethereum fees are high.
This proposal aims to develop a callData compression algorithm that would reduce gas costs by ~50% for trades made on Arbitrum and Optimism, creating another competitive advantage for Paraswap over other aggregators.
This reduction would directly make Paraswap more attractive than its competitors for small trades and could lead more traders to switch to Paraswap to save significant fees over time. The resulting increase in volume would generate sustainable additional revenue for the DAO, as these chains are very important in the EVM ecosystem.
The feasibility of this proposal has been confirmed by the core team across several meetings, for example with Mwamedacen and 0xYtocyn.


Goals & review
Transaction prices on optimistic rollups depend heavily on the size of the callData sent with the transaction. This proposal aims to develop a fast, versatile off-chain compression algorithm for callData and a decompression smart contract, and to integrate them into Paraswap's current architecture in coordination with the team. Delivery and deployment are expected before 2024.
I’m convinced that this development would benefit the Paraswap DAO and its users by reducing gas costs for every trade, increasing the volume and revenue on those chains, and making Paraswap more competitive.
The main goal of this proposal is to allocate 25k $OP tokens from the OP grant reserves of the Paraswap DAO, in order to fund the different stages of development related to this optimisation and research. The validity of the implementation will be checked at each stage by the core team, as well as by a member of the DAO.
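The reason callData size dominates the transaction price is the L1 data fee that rollups charge for posting the data to Ethereum. A rough sketch of the idea, assuming the Bedrock-era Optimism formula with illustrative (not current on-chain) `overhead` and `scalar` values:

```javascript
// Rough sketch of the Optimism L1 data fee (Bedrock-era formula):
// l1Fee ≈ (calldataGas + overhead) * scalar * l1BaseFee.
// The overhead and scalar defaults below are illustrative only.
function l1DataFee(calldataHex, l1BaseFeeWei, overhead = 188, scalar = 0.684) {
  const bytes = Buffer.from(calldataHex.replace(/^0x/, ""), "hex");
  let dataGas = 0;
  for (const b of bytes) dataGas += b === 0 ? 4 : 16; // EIP-2028 pricing
  return Math.round((dataGas + overhead) * scalar * l1BaseFeeWei);
}

// Halving the callData roughly halves the L1 data fee,
// which is the lever this proposal pulls on:
const fee1 = l1DataFee("0x" + "ff".repeat(1000), 30e9);
const fee2 = l1DataFee("0x" + "ff".repeat(500), 30e9);
console.log((fee1 / fee2).toFixed(2)); // ~2, minus the fixed overhead
```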


Means

- Initial 2k $OP on validation of the grant
- Research & POC: 5k $OP
- Evaluations and optimisations: 7k $OP + 3k $OP bonus (if average saving > 45%)
- Adaptations of the code base and deployment: 8k $OP
Total: 22k $OP + 3k $OP performance bonus (around $30k at the current $1.4 valuation).

Of these, 500 $OP would be sent to @Xut (who has accepted the role) at the end of each of the first two steps, in order to certify that the figures have been reached. So 1k $OP in total is included to give DAO members more confidence that all goals were achieved.

A penalty of 5k $OP would be applied to my rewards if the development isn’t live by the end of February 2024, to demonstrate my commitment to delivering this proposal quickly, unless the team attests that the delay comes from a difference in prioritisation of their internal resources.

If the bonus condition or any other step isn’t achieved, the corresponding amount would be returned to the DAO’s OP grant reserves.


Implementation Overview

Research & POC:
* Document gathering ideas for optimising callData size on Optimism (could stay confidential if necessary to preserve a competitive advantage)
* Implementation of a smart contract allowing callData decompression
* Implementation of a JavaScript script for compressing callData, whose output is given as input to the decompression method of the smart contract
* An average compression rate of at least 25% is required at this stage
Given the time the discussions took, and to make sure this proposal would be feasible, I have already made good progress on this part; it should be possible to deliver this phase in early October.
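As an illustration only (the actual algorithm stays private, per the terms below), even a naive zero-run-length scheme captures much of the redundancy in ABI-encoded callData; the function names and encoding here are hypothetical, not the proposed design:

```javascript
// Illustrative zero-run-length compression for callData: runs of zero
// bytes become a 0x00 marker followed by the run length (1-255);
// all other bytes are copied verbatim.
function compress(bytes) {
  const out = [];
  for (let i = 0; i < bytes.length; ) {
    if (bytes[i] === 0) {
      let run = 0;
      while (i < bytes.length && bytes[i] === 0 && run < 255) { run++; i++; }
      out.push(0, run);
    } else {
      out.push(bytes[i++]);
    }
  }
  return Uint8Array.from(out);
}

// The inverse, which in practice would live in the decompression contract.
function decompress(bytes) {
  const out = [];
  for (let i = 0; i < bytes.length; ) {
    if (bytes[i] === 0) {
      for (let k = 0; k < bytes[i + 1]; k++) out.push(0);
      i += 2;
    } else {
      out.push(bytes[i++]);
    }
  }
  return Uint8Array.from(out);
}

// A typical ABI word (31 zero bytes then 0x2a) shrinks from 32 to 3 bytes.
const word = Uint8Array.from([...Array(31).fill(0), 0x2a]);
console.log(compress(word).length); // 3
```

This is why ABI-heavy callData like the swap example earlier compresses so well: most 32-byte words are nearly all zeros.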

Evaluations and optimisations
* Create a framework for evaluating the performance and exactness of the compression algorithms against a variety of inputs, along with the related gas costs
* Optimise the contract to reduce the callData even further
* Optimisations of the compression algorithms
* An average compression rate of at least 35% is expected at this stage, and the algorithm should be efficient (running in a few ms)
* If the average compression rate reaches at least 45%, the 3k $OP bonus should be unlocked
This phase is estimated to be completed by mid-November.
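A sketch of what such an evaluation framework could look like; `compressFn`/`decompressFn` are placeholders for the real (private) algorithm (an identity pair here so the sketch is self-contained), and a real harness would also weight the rate by the 4/16 gas pricing rather than raw byte counts:

```javascript
// Hypothetical evaluation harness: checks round-trip exactness and
// reports the average compression rate over a set of sample callDatas.
const compressFn = (bytes) => bytes;   // placeholder: identity
const decompressFn = (bytes) => bytes; // placeholder: identity

function evaluate(samples) {
  let totalRate = 0;
  for (const hex of samples) {
    const input = Buffer.from(hex.replace(/^0x/, ""), "hex");
    const packed = compressFn(input);
    // Exactness: decompression must reproduce the input byte for byte.
    if (!Buffer.from(decompressFn(packed)).equals(input)) {
      throw new Error(`round-trip mismatch for ${hex}`);
    }
    totalRate += 1 - packed.length / input.length;
  }
  return totalRate / samples.length; // average byte-level compression rate
}

console.log(evaluate(["0x00000001", "0xdeadbeef"])); // 0 with the identity pair
```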

Adaptations of code base and deployment
* Testing in real conditions and final adaptations of the contract for production
* Help and guidance on the necessary modifications to the API and on-chain components
* Internal documentation and flows for the compression, well designed and proven
* Implementation of the compression algorithm in the language used by the Paraswap API
* Post-production support and slight optimisations where possible
* Gathering data and analytics on the savings
* Documentation and write-up of communications about the advantages and savings achieved through this optimisation
This phase is estimated to be released in production before the end of 2023.

All the materials from my research would be the property of the team and handed over in full at the end of each stage at the latest, so that the team could continue my work if something really bad happened. This way the DAO could find another person to continue the development, or the team could decide to improve and integrate it later on.
The source code could be made open source through a DAO vote, but given the competition between aggregators, this work will initially be kept private between the core team, Xut and me.


A word
As a member of the DAO for 2 years now, and a Paratrooper for 4 epochs, I would like to contribute more to the growth and sustainability of this great project, and I’m convinced it can serve as a great example for others to research and propose optimisations to the DAO.

Best,
Chab

Please let me know your thoughts.
Chab


Hello mates!

Some following here:

  • I’m organising the work to finish the first step with a bit of delay; the vote and this part took a bit more time than expected
  • The results from the implementation already look quite good (>30%), and I have only implemented around 50% of the ideas so far
  • This first milestone is expected to be reached within 10 days at most. The repository and the documents will be released and shared with the team.

I will keep posting updates on this thread so that everybody who was involved here stays as informed as possible.
The target is now to post around every two weeks!

Cheers,
Chab
