
Spark in a Console Application Targeting .NET 4.0

I was just wondering if anyone has successfully got Spark to work in a .NET 4.0 console application for compiling templates to HTML. Unfortunately I am getting the following error:

Unhandled Exception: Spark.Compiler.CompilerException: Dynamic view compilation failed.
(0,0): error CS1703: An assembly with the same identity 'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' has already been imported. Try removing one of the duplicate references.

When I target .NET 3.5 everything works fine, but I specifically wish to target 4.0. Has anyone solved this problem? Some old threads on the Spark mailing list suggest I may just have to edit a line in the source and recompile, but I hope that is a last resort.

EDIT:

    static void Main(string[] args)
    {
        // All three arguments are used below, so require all of them up front
        if (args.Length >= 3)
        {
            var templatePath = Path.Combine(Environment.CurrentDirectory, args[0]);
            var templateName = Path.GetFileName(templatePath);
            var templateDirPath = Path.GetDirectoryName(templatePath);
            var viewFolder = new FileSystemViewFolder(templateDirPath);

            var sparkEngine = new SparkViewEngine
            {
                DefaultPageBaseType = typeof(SparkView).FullName,
                ViewFolder = viewFolder.Append(new SubViewFolder(viewFolder, "Shared")),
            };

            var descriptor = new SparkViewDescriptor().AddTemplate(templateName);
            var view = sparkEngine.CreateInstance(descriptor) as SparkView;

            view.Model = args[1];

            using (var writer = new StreamWriter(new FileStream(args[2], FileMode.Create), Encoding.UTF8))
            {
                view.RenderView(writer);
            }
        }
        }
        else
        {
            Console.WriteLine(">>> error - missing arguments:\n\tSparkCompiler.exe [templatepath] [modelstring] [outputname]");
        }
    }


A fix for this has now been added to the main Spark master branch. You can either download the source and compile the latest binaries, or you can also use NuPack/NuGet to add a reference to your solution in VS2010 as the binaries there will be kept up to date from now on.

Hope that helps...


I didn't consider it a last resort. I changed Line #60 of src\Spark\Compiler\BatchCompiler.cs to

var providerOptions = new Dictionary<string, string> { { "CompilerVersion", "v4.0" } };

it was originally

var providerOptions = new Dictionary<string, string> { { "CompilerVersion", "v3.5" } };

After a recompile and referencing the new Spark.dll, everything worked like a charm. Er, um, I was able to proceed to the next exception.
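For context, the one-line fix works because Spark hands that dictionary to the CodeDOM C# provider, and `CompilerVersion` controls which csc CodeDOM invokes. Here is a minimal, self-contained sketch of the same pattern (not Spark's actual code; the `Demo` class is made up for illustration):

    using System;
    using System.Collections.Generic;
    using Microsoft.CSharp;

    public static class Demo
    {
        public static Dictionary<string, string> GetCompilerVersionOptions()
        {
            // Same shape as the patched line in src\Spark\Compiler\BatchCompiler.cs:
            // ask CodeDOM for the v4.0 compiler instead of v3.5
            return new Dictionary<string, string> { { "CompilerVersion", "v4.0" } };
        }

        public static void Main()
        {
            using (var provider = new CSharpCodeProvider(GetCompilerVersionOptions()))
            {
                // With v4.0 selected, the generated view assemblies reference the
                // .NET 4.0 mscorlib, so the duplicate-mscorlib CS1703 clash goes away.
                Console.WriteLine(provider.FileExtension);
            }
        }
    }

With the v3.5 provider, the dynamically compiled views pull in the 2.0-era mscorlib while the host process has already loaded the 4.0 one, which is exactly the "same identity ... already been imported" error in the question.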

