Conditional rendering in "Segments"

So I have the following Spark partial:

<default extension="null" footer="null" header="null" type="string"/>

<div class="mod ${extension}?{extension != null}">
  <div class="inner">
    <div class="hd ${header}?{header != null}">
      <render segment="hd" />
    </div>

    <div class="bd">
      <render />
    </div>
    <div class="ft ${footer}?{footer != null}">
      <render segment="ft" />
    </div>
  </div> 
</div>

I think that segments are really cool, but I only want one to render if I actually use it. Maybe something like this:

<default extension="null" footer="null" header="null" type="string"/>

<div class="mod ${extension}?{extension != null}">
  <div class="inner">
    <render segment="hd">
        <div class="hd ${header}?{header != null}">
           <!-- write content here -->
        </div>
    </render>

    <div class="bd">
      <render />
    </div>
   <render segment="ft">
        <div class="ft ${footer}?{footer!= null}">
           <!-- write content here -->
        </div>
    </render>
  </div> 
</div>

Usage like:

<mod>
    <p> My content </p>
    <segment name="hd">
        <h1> My Header </h1>
    </segment>
</mod>

Basically I'm trying to get Spark to only render segments that are used. In this instance I wouldn't want the <div class="ft" /> to render at all, and I would want the <h1> wrapped by the <div class="hd">.
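
To make that concrete, with the usage above I'd expect output roughly like this (class names assuming the null defaults, so the conditional ${extension} and ${header} parts drop out):

<div class="mod">
  <div class="inner">
    <div class="hd">
      <h1> My Header </h1>
    </div>
    <div class="bd">
      <p> My content </p>
    </div>
  </div>
</div>

No <div class="ft"> at all, since no ft segment was supplied.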


Does something like this work for you:

<default extension="null" footer="null" header="null" type="string"/>

<div class="mod ${extension}?{extension != null}">
  <div class="inner">
    <div class="hd ${header}" if="header != null">
      <render segment="hd" />
    </div>

    <div class="bd">
      <render />
    </div>
    <div class="ft ${footer}" if="footer != null">
      <render segment="ft" />
    </div>
  </div> 
</div>

This would have the effect of not rendering the divs at all if header or footer is null. You could do the same for extension, of course, but I assumed you always want to render the body at least, so I left that the way you had it. You can put an if="condition" on any node in Spark.
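
To illustrate with a rough sketch of the output: with header and footer left at their null defaults you'd get just the body,

<div class="mod">
  <div class="inner">
    <div class="bd">
      <p> My content </p>
    </div>
  </div>
</div>

while passing a non-null header, say "fancy", should bring the wrapper back as <div class="hd fancy"> around whatever the hd segment contains.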

Am I missing something here?

Cheers,
Rob
